12755 1727204074.17474: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-twx
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
12755 1727204074.18066: Added group all to inventory
12755 1727204074.18069: Added group ungrouped to inventory
12755 1727204074.18074: Group all now contains ungrouped
12755 1727204074.18077: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml
12755 1727204074.39505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
12755 1727204074.39560: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
12755 1727204074.39581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
12755 1727204074.39636: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
12755 1727204074.39700: Loaded config def from plugin (inventory/script)
12755 1727204074.39702: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
12755 1727204074.39738: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
12755 1727204074.39813: Loaded config def from plugin (inventory/yaml)
12755 1727204074.39817: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
12755 1727204074.39888: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
12755 1727204074.40258: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
12755 1727204074.40261: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
12755 1727204074.40263: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
12755 1727204074.40268: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
12755 1727204074.40271: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
12755 1727204074.40329: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto
12755 1727204074.40386: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
12755 1727204074.40423: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
12755 1727204074.40495: group all already in inventory
12755 1727204074.40500: set inventory_file for managed-node1
12755 1727204074.40504: set inventory_dir for managed-node1
12755 1727204074.40504: Added host managed-node1 to inventory
12755 1727204074.40506: Added host managed-node1 to group all
12755 1727204074.40507: set ansible_host for managed-node1
12755 1727204074.40508: set ansible_ssh_extra_args for managed-node1
12755 1727204074.40510: set inventory_file for managed-node2
12755 1727204074.40512: set inventory_dir for managed-node2
12755 1727204074.40513: Added host managed-node2 to inventory
12755 1727204074.40514: Added host managed-node2 to group all
12755 1727204074.40514: set ansible_host for managed-node2
12755 1727204074.40517: set ansible_ssh_extra_args for managed-node2
12755 1727204074.40519: set inventory_file for managed-node3
12755 1727204074.40521: set inventory_dir for managed-node3
12755 1727204074.40521: Added host managed-node3 to inventory
12755 1727204074.40522: Added host managed-node3 to group all
12755 1727204074.40523: set ansible_host for managed-node3
12755 1727204074.40524: set ansible_ssh_extra_args for managed-node3
12755 1727204074.40526: Reconcile groups and hosts in inventory.
12755 1727204074.40529: Group ungrouped now contains managed-node1
12755 1727204074.40530: Group ungrouped now contains managed-node2
12755 1727204074.40532: Group ungrouped now contains managed-node3
12755 1727204074.40598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
12755 1727204074.40704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
12755 1727204074.40745: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
12755 1727204074.40768: Loaded config def from plugin (vars/host_group_vars)
12755 1727204074.40770: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
12755 1727204074.40776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
12755 1727204074.40784: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
12755 1727204074.40825: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
12755 1727204074.41098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204074.41179: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
12755 1727204074.41212: Loaded config def from plugin (connection/local)
12755 1727204074.41218: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
12755 1727204074.41750: Loaded config def from plugin (connection/paramiko_ssh)
12755 1727204074.41752: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
12755 1727204074.42495: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12755 1727204074.42532: Loaded config def from plugin (connection/psrp)
12755 1727204074.42535: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
12755 1727204074.43228: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12755 1727204074.43282: Loaded config def from plugin (connection/ssh)
12755 1727204074.43286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
12755 1727204074.45528: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12755 1727204074.45578: Loaded config def from plugin (connection/winrm)
12755 1727204074.45581: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
12755 1727204074.45612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
12755 1727204074.45669: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
12755 1727204074.45731: Loaded config def from plugin (shell/cmd)
12755 1727204074.45733: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
12755 1727204074.45757: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
12755 1727204074.45813: Loaded config def from plugin (shell/powershell)
12755 1727204074.45814: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
12755 1727204074.45865: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
12755 1727204074.46028: Loaded config def from plugin (shell/sh)
12755 1727204074.46030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
12755 1727204074.46059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
12755 1727204074.46164: Loaded config def from plugin (become/runas)
12755 1727204074.46166: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
12755 1727204074.46327: Loaded config def from plugin (become/su)
12755 1727204074.46329: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
12755 1727204074.46464: Loaded config def from plugin (become/sudo)
12755 1727204074.46466: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
12755 1727204074.46499: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
12755 1727204074.46781: in VariableManager get_vars()
12755 1727204074.46801: done with get_vars()
12755 1727204074.46925: trying /usr/local/lib/python3.12/site-packages/ansible/modules
12755 1727204074.50093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
12755 1727204074.50201: in VariableManager get_vars()
12755 1727204074.50207: done with get_vars()
12755 1727204074.50210: variable 'playbook_dir' from source: magic vars
12755 1727204074.50211: variable 'ansible_playbook_python' from source: magic vars
12755 1727204074.50212: variable 'ansible_config_file' from source: magic vars
12755 1727204074.50213: variable 'groups' from source: magic vars
12755 1727204074.50213: variable 'omit' from source: magic vars
12755 1727204074.50214: variable 'ansible_version' from source: magic vars
12755 1727204074.50217: variable 'ansible_check_mode' from source: magic vars
12755 1727204074.50218: variable 'ansible_diff_mode' from source: magic vars
12755 1727204074.50219: variable 'ansible_forks' from source: magic vars
12755 1727204074.50220: variable 'ansible_inventory_sources' from source: magic vars
12755 1727204074.50221: variable 'ansible_skip_tags' from source: magic vars
12755 1727204074.50222: variable 'ansible_limit' from source: magic vars
12755 1727204074.50223: variable 'ansible_run_tags' from source: magic vars
12755 1727204074.50223: variable 'ansible_verbosity' from source: magic vars
12755 1727204074.50259: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml
12755 1727204074.51285: in VariableManager get_vars()
12755 1727204074.51323: done with get_vars()
12755 1727204074.51335: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
12755 1727204074.52905: in VariableManager get_vars()
12755 1727204074.52926: done with get_vars()
12755 1727204074.52964: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
12755 1727204074.53198: in VariableManager get_vars()
12755 1727204074.53219: done with get_vars()
12755 1727204074.53365: in VariableManager get_vars()
12755 1727204074.53376: done with get_vars()
12755 1727204074.53383: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
12755 1727204074.53453: in VariableManager get_vars()
12755 1727204074.53466: done with get_vars()
12755 1727204074.53712: in VariableManager get_vars()
12755 1727204074.53726: done with get_vars()
12755 1727204074.53731: variable 'omit' from source: magic vars
12755 1727204074.53747: variable 'omit' from source: magic vars
12755 1727204074.53774: in VariableManager get_vars()
12755 1727204074.53784: done with get_vars()
12755 1727204074.53825: in VariableManager get_vars()
12755 1727204074.53838: done with get_vars()
12755 1727204074.53869: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
12755 1727204074.54052: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
12755 1727204074.54162: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
12755 1727204074.56334: in VariableManager get_vars()
12755 1727204074.56361: done with get_vars()
12755 1727204074.56908: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
12755 1727204074.57095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12755 1727204074.58934: in VariableManager get_vars()
12755 1727204074.58950: done with get_vars()
12755 1727204074.58957: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
12755 1727204074.59037: in VariableManager get_vars()
12755 1727204074.59052: done with get_vars()
12755 1727204074.59158: in VariableManager get_vars()
12755 1727204074.59171: done with get_vars()
12755 1727204074.59403: in VariableManager get_vars()
12755 1727204074.59419: done with get_vars()
12755 1727204074.59424: variable 'omit' from source: magic vars
12755 1727204074.59433: variable 'omit' from source: magic vars
12755 1727204074.59569: variable 'controller_profile' from source: play vars
12755 1727204074.59611: in VariableManager get_vars()
12755 1727204074.59623: done with get_vars()
12755 1727204074.59642: in VariableManager get_vars()
12755 1727204074.59653: done with get_vars()
12755 1727204074.59680: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
12755 1727204074.59788: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
12755 1727204074.59852: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
12755 1727204074.60172: in VariableManager get_vars()
12755 1727204074.60192: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12755 1727204074.62800: in VariableManager get_vars()
12755 1727204074.62818: done with get_vars()
12755 1727204074.62822: variable 'omit' from source: magic vars
12755 1727204074.62830: variable 'omit' from source: magic vars
12755 1727204074.62854: in VariableManager get_vars()
12755 1727204074.62866: done with get_vars()
12755 1727204074.62891: in VariableManager get_vars()
12755 1727204074.62907: done with get_vars()
12755 1727204074.62930: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
12755 1727204074.63025: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
12755 1727204074.63094: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
12755 1727204074.63437: in VariableManager get_vars()
12755 1727204074.63457: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12755 1727204074.65610: in VariableManager get_vars()
12755 1727204074.65630: done with get_vars()
12755 1727204074.65634: variable 'omit' from source: magic vars
12755 1727204074.65643: variable 'omit' from source: magic vars
12755 1727204074.65671: in VariableManager get_vars()
12755 1727204074.65698: done with get_vars()
12755 1727204074.65715: in VariableManager get_vars()
12755 1727204074.65731: done with get_vars()
12755 1727204074.65753: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
12755 1727204074.65847: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
12755 1727204074.65913: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
12755 1727204074.66297: in VariableManager get_vars()
12755 1727204074.66319: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12755 1727204074.69172: in VariableManager get_vars()
12755 1727204074.69210: done with get_vars()
12755 1727204074.69221: variable 'omit' from source: magic vars
12755 1727204074.69257: variable 'omit' from source: magic vars
12755 1727204074.69321: in VariableManager get_vars()
12755 1727204074.69356: done with get_vars()
12755 1727204074.69384: in VariableManager get_vars()
12755 1727204074.69415: done with get_vars()
12755 1727204074.69452: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
12755 1727204074.69684: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
12755 1727204074.69941: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
12755 1727204074.70537: in VariableManager get_vars()
12755 1727204074.70572: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12755 1727204074.74034: in VariableManager get_vars()
12755 1727204074.74058: done with get_vars()
12755 1727204074.74065: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
12755 1727204074.74483: in VariableManager get_vars()
12755 1727204074.74509: done with get_vars()
12755 1727204074.74564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
12755 1727204074.74576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
12755 1727204074.74779: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
12755 1727204074.74919: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
12755 1727204074.74922: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
12755 1727204074.74948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
12755 1727204074.74971: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
12755 1727204074.75115: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
12755 1727204074.75165: Loaded config def from plugin (callback/default)
12755 1727204074.75167: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12755 1727204074.76135: Loaded config def from plugin (callback/junit)
12755 1727204074.76137: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12755 1727204074.76178: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
12755 1727204074.76238: Loaded config def from plugin (callback/minimal)
12755 1727204074.76240: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12755 1727204074.76293: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12755 1727204074.76377: Loaded config def from plugin (callback/tree)
12755 1727204074.76381: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
12755 1727204074.76529: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
12755 1727204074.76532: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_removal_nm.yml ********************************************
2 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
12755 1727204074.76564: in VariableManager get_vars()
12755 1727204074.76578: done with get_vars()
12755 1727204074.76585: in VariableManager get_vars()
12755 1727204074.76597: done with get_vars()
12755 1727204074.76607: variable 'omit' from source: magic vars
12755 1727204074.76654: in VariableManager get_vars()
12755 1727204074.76670: done with get_vars()
12755 1727204074.76696: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with nm as provider] *****
12755 1727204074.77348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
12755 1727204074.77438: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
12755 1727204074.77708: getting the remaining hosts for this loop
12755 1727204074.77710: done getting the remaining hosts for this loop
12755 1727204074.77714: getting the next task for host managed-node1
12755 1727204074.77718: done getting next task for host managed-node1
12755 1727204074.77720: ^ task is: TASK: Gathering Facts
12755 1727204074.77722: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204074.77725: getting variables
12755 1727204074.77727: in VariableManager get_vars()
12755 1727204074.77738: Calling all_inventory to load vars for managed-node1
12755 1727204074.77741: Calling groups_inventory to load vars for managed-node1
12755 1727204074.77744: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204074.77759: Calling all_plugins_play to load vars for managed-node1
12755 1727204074.77773: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204074.77778: Calling groups_plugins_play to load vars for managed-node1
12755 1727204074.77825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204074.78000: done with get_vars()
12755 1727204074.78008: done getting variables
12755 1727204074.78082: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
Tuesday 24 September 2024 14:54:34 -0400 (0:00:00.016) 0:00:00.016 *****
12755 1727204074.78109: entering _queue_task() for managed-node1/gather_facts
12755 1727204074.78111: Creating lock for gather_facts
12755 1727204074.78997: worker is 1 (out of 1 available)
12755 1727204074.79005: exiting _queue_task() for managed-node1/gather_facts
12755 1727204074.79021: done queuing things up, now waiting for results queue to drain
12755 1727204074.79023: waiting for pending results...
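Once a worker picks a task up, Ansible's SSH connection plugin drives the remote side with tiny `/bin/sh -c` probe strings; the first one this run logs is `echo ~ && sleep 0`, which discovers the remote user's home directory. A minimal local sketch of that probe (run against the local `/bin/sh` here rather than over SSH, purely for illustration):

```shell
# Hedged reproduction of Ansible's home-directory probe. The probe string is
# the one quoted in the log; executing it locally instead of via ssh is the
# only substitution made here.
probe='echo ~ && sleep 0'
home="$(/bin/sh -c "$probe")"   # sh expands ~ to the invoking user's $HOME
echo "$home"
```

The trailing `sleep 0` matters only for how the connection multiplexing flushes output; the useful payload is the single expanded path on stdout.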
12755 1727204074.79080: running TaskExecutor() for managed-node1/TASK: Gathering Facts
12755 1727204074.79204: in run() - task 12b410aa-8751-72e9-1a19-0000000001bc
12755 1727204074.79230: variable 'ansible_search_path' from source: unknown
12755 1727204074.79278: calling self._execute()
12755 1727204074.79366: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204074.79380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204074.79398: variable 'omit' from source: magic vars
12755 1727204074.79528: variable 'omit' from source: magic vars
12755 1727204074.79568: variable 'omit' from source: magic vars
12755 1727204074.79625: variable 'omit' from source: magic vars
12755 1727204074.79685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204074.79740: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204074.79767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204074.79798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204074.79821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204074.79863: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12755 1727204074.79872: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204074.79881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204074.80094: Set connection var ansible_connection to ssh
12755 1727204074.80097: Set connection var ansible_module_compression to ZIP_DEFLATED
12755 1727204074.80099: Set connection var ansible_shell_type to sh
12755 1727204074.80101: Set connection var ansible_timeout to 10
12755 1727204074.80103: Set connection var ansible_shell_executable to /bin/sh
12755 1727204074.80105: Set connection var ansible_pipelining to False
12755 1727204074.80107: variable 'ansible_shell_executable' from source: unknown
12755 1727204074.80108: variable 'ansible_connection' from source: unknown
12755 1727204074.80110: variable 'ansible_module_compression' from source: unknown
12755 1727204074.80112: variable 'ansible_shell_type' from source: unknown
12755 1727204074.80114: variable 'ansible_shell_executable' from source: unknown
12755 1727204074.80119: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204074.80234: variable 'ansible_pipelining' from source: unknown
12755 1727204074.80238: variable 'ansible_timeout' from source: unknown
12755 1727204074.80240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204074.80375: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12755 1727204074.80397: variable 'omit' from source: magic vars
12755 1727204074.80407: starting attempt loop
12755 1727204074.80413: running the handler
12755 1727204074.80438: variable 'ansible_facts' from source: unknown
12755 1727204074.80467: _low_level_execute_command(): starting
12755 1727204074.80480: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12755 1727204074.81408: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204074.81499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12755 1727204074.81522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204074.81554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204074.81731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204074.83515: stdout chunk (state=3): >>>/root <<<
12755 1727204074.83705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204074.83794: stderr chunk (state=3): >>><<<
12755 1727204074.83805: stdout chunk (state=3): >>><<<
12755 1727204074.83841: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12755 1727204074.83871: _low_level_execute_command(): starting
12755 1727204074.83881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808 `" && echo ansible-tmp-1727204074.8385658-12789-180993002939808="` echo /root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808 `" ) && sleep 0'
12755 1727204074.84688: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
12755 1727204074.84695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204074.84697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12755 1727204074.84701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12755 1727204074.84703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<<
12755 1727204074.84705: stderr chunk (state=3): >>>debug2: match not found <<<
12755 1727204074.84707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204074.84709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12755 1727204074.84720: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<<
12755 1727204074.84722: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12755 1727204074.84724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204074.84726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12755 1727204074.84728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12755 1727204074.84730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<<
12755 1727204074.84732: stderr chunk (state=3): >>>debug2: match found <<<
12755 1727204074.84735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204074.84851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12755 1727204074.84855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204074.84857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204074.85025: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204074.87152: stdout chunk (state=3): >>>ansible-tmp-1727204074.8385658-12789-180993002939808=/root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808 <<<
12755 1727204074.87386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204074.87392: stdout chunk (state=3): >>><<<
12755 1727204074.87401: stderr chunk (state=3): >>><<<
12755 1727204074.87468: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204074.8385658-12789-180993002939808=/root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally
10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204074.87472: variable 'ansible_module_compression' from source: unknown 12755 1727204074.87525: ANSIBALLZ: Using generic lock for ansible.legacy.setup 12755 1727204074.87530: ANSIBALLZ: Acquiring lock 12755 1727204074.87533: ANSIBALLZ: Lock acquired: 139630693732560 12755 1727204074.87536: ANSIBALLZ: Creating module 12755 1727204075.35853: ANSIBALLZ: Writing module into payload 12755 1727204075.35976: ANSIBALLZ: Writing module 12755 1727204075.36004: ANSIBALLZ: Renaming module 12755 1727204075.36010: ANSIBALLZ: Done creating module 12755 1727204075.36045: variable 'ansible_facts' from source: unknown 12755 1727204075.36052: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204075.36062: _low_level_execute_command(): starting 12755 1727204075.36069: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v 
'"'"'python3'"'"'; echo ENDFOUND && sleep 0' 12755 1727204075.36592: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204075.36597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204075.36599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204075.36602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204075.36654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204075.36658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204075.36720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204075.38611: stdout chunk (state=3): >>>PLATFORM <<< 12755 1727204075.38683: stdout chunk (state=3): >>>Linux <<< 12755 1727204075.38698: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 12755 1727204075.38713: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 12755 1727204075.38865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204075.38955: stderr chunk (state=3): >>><<< 12755 
1727204075.38958: stdout chunk (state=3): >>><<< 12755 1727204075.38974: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204075.38988 [managed-node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 12755 1727204075.39046: _low_level_execute_command(): starting 12755 1727204075.39051: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 12755 1727204075.39192: Sending initial data 12755 1727204075.39195: Sent initial data (1181 bytes) 12755 1727204075.39661: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204075.39688: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204075.39697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204075.39776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204075.39787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204075.39850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204075.43765: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 
12755 1727204075.44040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204075.44101: stderr chunk (state=3): >>><<< 12755 1727204075.44105: stdout chunk (state=3): >>><<< 12755 1727204075.44114: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204075.44196: variable 'ansible_facts' from source: unknown 12755 1727204075.44200: variable 'ansible_facts' from source: unknown 12755 1727204075.44210: variable 'ansible_module_compression' from source: unknown 12755 1727204075.44245: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12755 1727204075.44272: variable 'ansible_facts' from source: unknown 12755 1727204075.44385: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/AnsiballZ_setup.py 12755 1727204075.44520: Sending initial data 12755 1727204075.44523: Sent initial data (154 bytes) 12755 1727204075.44959: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204075.44998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204075.45002: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204075.45004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204075.45007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204075.45009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204075.45076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204075.45079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204075.45120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204075.46868: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204075.46915: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204075.46956: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmppwlp0flx /root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/AnsiballZ_setup.py <<< 12755 1727204075.46960: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/AnsiballZ_setup.py" <<< 12755 1727204075.47001: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmppwlp0flx" to remote "/root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/AnsiballZ_setup.py" <<< 12755 1727204075.47009: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/AnsiballZ_setup.py" <<< 12755 1727204075.48737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204075.48813: stderr chunk (state=3): >>><<< 12755 1727204075.48819: stdout chunk (state=3): >>><<< 12755 1727204075.48841: done transferring module to remote 12755 1727204075.48855: _low_level_execute_command(): starting 12755 1727204075.48865: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/ /root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/AnsiballZ_setup.py && sleep 0' 12755 1727204075.49351: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204075.49355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204075.49357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204075.49360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204075.49362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204075.49426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204075.49430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204075.49485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204075.51425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204075.51486: stderr chunk (state=3): >>><<< 12755 1727204075.51496: stdout chunk (state=3): >>><<< 12755 1727204075.51527: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204075.51530: _low_level_execute_command(): starting 12755 1727204075.51532: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/AnsiballZ_setup.py && sleep 0' 12755 1727204075.52121: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204075.52125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204075.52131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204075.52134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204075.52184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204075.52206: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204075.52271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204075.54550: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 12755 1727204075.54587: stdout chunk (state=3): >>>import _imp # builtin <<< 12755 1727204075.54629: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 12755 1727204075.54631: stdout chunk (state=3): >>>import '_weakref' # <<< 12755 1727204075.54701: stdout chunk (state=3): >>>import '_io' # <<< 12755 1727204075.54706: stdout chunk (state=3): >>>import 'marshal' # <<< 12755 1727204075.54771: stdout chunk (state=3): >>>import 'posix' # <<< 12755 1727204075.54794: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12755 1727204075.54827: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 12755 1727204075.54887: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 12755 1727204075.54906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 12755 1727204075.54932: stdout chunk (state=3): >>>import 'codecs' # <<< 12755 1727204075.54966: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 12755 1727204075.55011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 12755 1727204075.55015: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519840c4d0> <<< 12755 1727204075.55059: stdout chunk (state=3): >>>import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f51983dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 12755 1727204075.55078: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519840ea20> <<< 12755 1727204075.55085: stdout chunk (state=3): >>>import '_signal' # <<< 12755 1727204075.55135: stdout chunk (state=3): >>>import '_abc' # <<< 12755 1727204075.55141: stdout chunk (state=3): >>>import 'abc' # <<< 12755 1727204075.55157: stdout chunk (state=3): >>>import 'io' # <<< 12755 1727204075.55191: stdout chunk (state=3): >>>import '_stat' # <<< 12755 1727204075.55207: stdout chunk (state=3): >>>import 'stat' # <<< 12755 1727204075.55292: stdout chunk (state=3): >>>import '_collections_abc' # <<< 12755 1727204075.55327: stdout chunk (state=3): >>>import 'genericpath' # <<< 12755 1727204075.55333: stdout chunk (state=3): >>>import 'posixpath' # <<< 12755 1727204075.55374: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 12755 1727204075.55405: stdout chunk (state=3): >>>Processing user site-packages <<< 12755 1727204075.55425: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 12755 1727204075.55441: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 12755 1727204075.55448: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 12755 1727204075.55487: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 
12755 1727204075.55502: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981bd0a0> <<< 12755 1727204075.55564: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 12755 1727204075.55577: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204075.55581: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981bdfd0> <<< 12755 1727204075.55615: stdout chunk (state=3): >>>import 'site' # <<< 12755 1727204075.55648: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12755 1727204075.56053: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12755 1727204075.56067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12755 1727204075.56103: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 12755 1727204075.56125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204075.56142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 12755 1727204075.56207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 12755 1727204075.56223: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 
12755 1727204075.56264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12755 1727204075.56294: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981fbe00> <<< 12755 1727204075.56343: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 12755 1727204075.56374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981fbec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 12755 1727204075.56401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 12755 1727204075.56563: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 12755 1727204075.56587: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198233800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198233e90> <<< 12755 1727204075.56602: stdout chunk (state=3): >>>import '_collections' # <<< 12755 1727204075.56639: stdout chunk (state=3): >>>import 'collections' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5198213ad0> <<< 12755 1727204075.56657: stdout chunk (state=3): >>>import '_functools' # <<< 12755 1727204075.56687: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982111f0> <<< 12755 1727204075.56899: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981f8fb0> <<< 12755 1727204075.56906: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 12755 1727204075.56941: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 12755 1727204075.57011: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198257710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198256330> <<< 12755 1727204075.57121: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982121e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981faea0> <<< 12755 1727204075.57136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from 
'/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198288740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981f8230> <<< 12755 1727204075.57239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5198288bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198288aa0> <<< 12755 1727204075.57271: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5198288e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981f6d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 12755 1727204075.57320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198289550> <<< 12755 1727204075.57340: stdout chunk (state=3): >>>import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5198289220> import 'importlib.machinery' # <<< 12755 1727204075.57368: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 12755 1727204075.57453: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519828a450> import 'importlib.util' # <<< 12755 1727204075.57456: stdout chunk (state=3): >>>import 'runpy' # <<< 12755 1727204075.57530: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 12755 1727204075.57578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982a4680> <<< 12755 1727204075.57655: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51982a5dc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 12755 1727204075.57758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f51982a6cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51982a7320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982a6210> <<< 12755 1727204075.57820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204075.57884: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51982a7da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982a74d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519828a4b0> <<< 12755 1727204075.57906: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 12755 1727204075.57916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12755 1727204075.57926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12755 1727204075.57946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12755 1727204075.57997: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # 
extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197fa3d10> <<< 12755 1727204075.58037: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 12755 1727204075.58044: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204075.58051: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197fcc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fcc470> <<< 12755 1727204075.58076: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197fcc740> <<< 12755 1727204075.58108: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197fcc920> <<< 12755 1727204075.58135: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fa1eb0> <<< 12755 1727204075.58156: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches 
/usr/lib64/python3.12/weakref.py <<< 12755 1727204075.58286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 12755 1727204075.58364: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 12755 1727204075.58397: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fcdf70> <<< 12755 1727204075.58410: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fccbf0> <<< 12755 1727204075.58413: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519828aba0> <<< 12755 1727204075.58436: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12755 1727204075.58496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204075.58517: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12755 1727204075.58568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 12755 1727204075.58600: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197ffa2d0> <<< 12755 1727204075.58649: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12755 1727204075.58669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204075.58701: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 12755 1727204075.58728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12755 1727204075.58766: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51980123f0> <<< 12755 1727204075.58797: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12755 1727204075.58834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12755 1727204075.58907: stdout chunk (state=3): >>>import 'ntpath' # <<< 12755 1727204075.58932: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519804f1d0> <<< 12755 1727204075.58956: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12755 1727204075.59008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12755 1727204075.59034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12755 1727204075.59068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12755 1727204075.59166: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198071970> <<< 12755 1727204075.59250: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f519804f2f0> <<< 12755 1727204075.59295: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198013080> <<< 12755 1727204075.59328: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197e982c0> <<< 12755 1727204075.59355: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198011430> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fcee70> <<< 12755 1727204075.59540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 12755 1727204075.59553: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5198011550> <<< 12755 1727204075.59762: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_vpumxyeq/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 12755 1727204075.59927: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.59974: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12755 1727204075.59978: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12755 1727204075.60026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12755 1727204075.60154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12755 1727204075.60157: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197efe030> <<< 12755 1727204075.60160: stdout chunk (state=3): >>>import '_typing' # <<< 12755 1727204075.60572: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197ed4f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197e9bfb0> <<< 12755 1727204075.60575: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 12755 1727204075.62085: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.63441: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197ed7ec0> <<< 12755 1727204075.63488: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12755 1727204075.63513: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12755 1727204075.63561: 
stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197f31af0> <<< 12755 1727204075.63607: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197f31880> <<< 12755 1727204075.63641: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197f31190> <<< 12755 1727204075.63698: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12755 1727204075.63779: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197f315e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197efecc0> import 'atexit' # <<< 12755 1727204075.63782: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197f328a0> <<< 12755 1727204075.63801: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197f32ae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12755 
1727204075.63858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12755 1727204075.63869: stdout chunk (state=3): >>>import '_locale' # <<< 12755 1727204075.63927: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197f32fc0> <<< 12755 1727204075.63931: stdout chunk (state=3): >>>import 'pwd' # <<< 12755 1727204075.63971: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12755 1727204075.64018: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d94e00> <<< 12755 1727204075.64046: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197d96a20> <<< 12755 1727204075.64074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12755 1727204075.64110: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d973b0> <<< 12755 1727204075.64136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12755 1727204075.64160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 12755 1727204075.64202: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d98590> <<< 12755 
1727204075.64261: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12755 1727204075.64265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12755 1727204075.64286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12755 1727204075.64331: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9b020> <<< 12755 1727204075.64419: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197d9b170> <<< 12755 1727204075.64596: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d991f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9eed0> <<< 12755 1727204075.64620: stdout chunk (state=3): >>>import '_tokenize' # <<< 12755 1727204075.64707: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9d9a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9d700> <<< 12755 1727204075.64714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12755 1727204075.64768: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9ff20> <<< 12755 1727204075.64860: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d997f0> <<< 12755 1727204075.64892: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197de2f90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197de3140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 12755 1727204075.65002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 12755 1727204075.65007: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197de8d10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197de8ad0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12755 1727204075.65179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12755 1727204075.65182: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197deb2c0> <<< 12755 1727204075.65184: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197de9400> <<< 12755 1727204075.65246: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12755 1727204075.65284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204075.65679: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 12755 1727204075.65728: stdout chunk (state=3): >>>import 'string' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5197df2ae0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197deb470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df3dd0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df3bc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df3f20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197de3440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 12755 1727204075.65751: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12755 1727204075.65778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12755 1727204075.65809: stdout chunk (state=3): >>># extension module '_socket' loaded from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204075.65839: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df6b10> <<< 12755 1727204075.66033: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204075.66077: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df7e90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197df52b0> <<< 12755 1727204075.66442: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df6660> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197df4e60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 12755 1727204075.66445: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.66447: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 12755 1727204075.66450: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.66577: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 12755 1727204075.66725: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.67407: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.68112: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 12755 1727204075.68196: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204075.68219: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197c80080> <<< 12755 1727204075.68335: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 12755 1727204075.68365: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c810a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197dfb530> <<< 12755 1727204075.68419: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12755 1727204075.68465: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.68518: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 12755 1727204075.68672: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 
1727204075.68855: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 12755 1727204075.68888: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c81220> # zipimport: zlib available <<< 12755 1727204075.69457: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.70018: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.70093: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.70187: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12755 1727204075.70195: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.70273: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.70324: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 12755 1727204075.70403: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.70526: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 12755 1727204075.70556: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 12755 1727204075.70559: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.70605: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.70644: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12755 1727204075.70658: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.70952: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.71231: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12755 1727204075.71307: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12755 1727204075.71329: stdout chunk (state=3): >>>import '_ast' # <<< 12755 1727204075.71415: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c83560> <<< 12755 1727204075.71433: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.71518: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.71672: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12755 1727204075.71811: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204075.71984: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197c89c40> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197c8a5a0> <<< 12755 1727204075.72020: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c82690> # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.72070: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib 
available <<< 12755 1727204075.72129: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.72191: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.72266: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.72360: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12755 1727204075.72404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204075.72456: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197c89400> <<< 12755 1727204075.72501: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c8a840> <<< 12755 1727204075.72546: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 12755 1727204075.72877: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12755 
1727204075.72951: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12755 1727204075.72959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12755 1727204075.72962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12755 1727204075.72965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12755 1727204075.73009: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d22b10> <<< 12755 1727204075.73057: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c947d0> <<< 12755 1727204075.73165: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c93c50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c926f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 12755 1727204075.73224: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73290: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12755 1727204075.73331: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 12755 1727204075.73407: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73505: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73508: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73532: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 
1727204075.73561: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73659: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73694: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73753: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 12755 1727204075.73778: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73875: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73904: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.73971: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 12755 1727204075.74130: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.74397: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.74437: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204075.74491: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 12755 1727204075.74525: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 12755 1727204075.74578: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d29100> <<< 12755 1727204075.74626: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 12755 1727204075.74669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 12755 1727204075.74767: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 12755 1727204075.74771: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197764140> <<< 12755 1727204075.74837: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197764440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c751f0> <<< 12755 1727204075.74930: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c747a0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d2b3b0> <<< 12755 1727204075.74948: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d2aab0> <<< 12755 1727204075.74969: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 12755 1727204075.74993: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 12755 1727204075.75030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 12755 1727204075.75101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204075.75113: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197767380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197766c30> <<< 12755 1727204075.75148: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197766e10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197766090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12755 1727204075.75277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 12755 1727204075.75311: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51977674d0> <<< 12755 1727204075.75336: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 12755 1727204075.75373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 12755 1727204075.75379: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51977d2000> <<< 12755 1727204075.75453: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197767fe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d2a840> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 12755 1727204075.75483: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 12755 1727204075.75516: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.75594: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.75795: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.75799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 12755 1727204075.75870: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.75926: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # 
zipimport: zlib available <<< 12755 1727204075.75977: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.76229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 12755 1727204075.76246: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.76279: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.76358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 12755 1727204075.76383: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.76916: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.77426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 12755 1727204075.77510: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.77563: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.77596: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.77643: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 12755 1727204075.77680: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.77714: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 12755 1727204075.77786: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.77887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 12755 1727204075.77967: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' 
# # zipimport: zlib available <<< 12755 1727204075.78085: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 12755 1727204075.78253: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.78260: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 12755 1727204075.78266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 12755 1727204075.78540: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51977d33b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51977d2c60> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 12755 1727204075.78639: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 12755 1727204075.78712: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.78773: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 12755 1727204075.78895: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.79009: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.79058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 12755 1727204075.79149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 12755 
1727204075.79246: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204075.79268: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51977fe2a0> <<< 12755 1727204075.79463: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51977eb140> <<< 12755 1727204075.79475: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 12755 1727204075.79525: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.79629: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 12755 1727204075.79898: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.79941: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.80066: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 12755 1727204075.80229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.80272: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 12755 1727204075.80315: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204075.80694: stdout chunk (state=3): >>># extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51970a1df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51970a1a60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 12755 1727204075.80716: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.80842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 12755 1727204075.80864: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.80959: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.81073: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.81198: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 12755 1727204075.81224: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.81384: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.81549: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 12755 1727204075.81565: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.81685: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.81829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 12755 1727204075.81873: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.81942: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12755 1727204075.82549: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.83148: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 12755 1727204075.83171: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.83278: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.83413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 12755 1727204075.83416: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.83507: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.83623: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 12755 1727204075.83798: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.83997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 12755 1727204075.84027: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.84030: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 12755 1727204075.84057: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.84114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 12755 1727204075.84125: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.84226: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.84361: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.84726: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.84782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 12755 1727204075.84811: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 12755 1727204075.84853: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.84968: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 12755 1727204075.85031: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.85113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 12755 1727204075.85186: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 12755 1727204075.85232: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.85303: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 12755 1727204075.85367: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.85433: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 12755 1727204075.85449: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.86199: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.86341: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 12755 1727204075.86358: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.86422: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 12755 1727204075.86627: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # <<< 12755 1727204075.86631: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204075.86648: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 12755 1727204075.86698: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.86788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 12755 1727204075.86800: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.86814: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.86847: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.86905: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.86985: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.87093: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 12755 1727204075.87097: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.87132: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.87191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 12755 1727204075.87227: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.87442: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.87669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 12755 1727204075.87711: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.87777: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 12755 1727204075.87780: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.87901: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.87917: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 12755 1727204075.88008: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.88122: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 12755 1727204075.88125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 12755 1727204075.88186: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204075.88496: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 12755 1727204075.88597: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 12755 1727204075.88643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 12755 1727204075.88647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 12755 1727204075.88695: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51970cb080> import 'stringprep' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f51970cb2f0> <<< 12755 1727204075.88744: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51970c9760> <<< 12755 1727204076.06958: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 12755 1727204076.07055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 12755 1727204076.07117: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197110ce0> <<< 12755 1727204076.07125: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 12755 1727204076.07237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 12755 1727204076.07240: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197111c70> <<< 12755 1727204076.07263: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 12755 1727204076.07304: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197164380> <<< 12755 1727204076.07308: stdout 
chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51971121e0> <<< 12755 1727204076.07604: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 12755 1727204076.27891: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": 
"ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_loadavg": {"1m": 0.52685546875, "5m": 0.<<< 12755 1727204076.27980: stdout chunk (state=3): >>>36962890625, "15m": 0.1767578125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", 
"weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "35", "epoch": "1727204075", "epoch_int": "1727204075", "date": "2024-09-24", "time": "14:54:35", "iso8601_micro": "2024-09-24T18:54:35.899772Z", "iso8601": "2024-09-24T18:54:35Z", "iso8601_basic": "20240924T145435899772", "iso8601_basic_short": "20240924T145435", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d080:f60d:659:9515", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.210"], "ansible_all_ipv6_addresses": ["fe80::d080:f60d:659:9515"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.210", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d080:f60d:659:9515"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, 
"ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2833, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 884, "free": 2833}, "nocache": {"free": 3458, "used": 259}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_uuid": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", 
"partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 566, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251156553728, "block_size": 4096, "block_total": 64479564, "block_available": 61317518, "block_used": 3162046, "inode_total": 16384000, "inode_available": 16302250, "inode_used": 81750, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12755 1727204076.28797: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings <<< 12755 1727204076.28802: stdout chunk (state=3): >>># cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing 
encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 12755 1727204076.28807: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # 
cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils <<< 12755 1727204076.28818: stdout chunk (state=3): >>># cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] 
removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal <<< 12755 1727204076.28821: stdout chunk (state=3): >>># cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian <<< 12755 1727204076.28827: stdout chunk (state=3): >>># cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # 
cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux <<< 12755 1727204076.28829: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context <<< 12755 
1727204076.28832: stdout chunk (state=3): >>># cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python <<< 12755 1727204076.28838: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd <<< 12755 1727204076.28893: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing 
ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy 
ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb <<< 12755 1727204076.28921: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 12755 1727204076.29282: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12755 1727204076.29304: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 12755 1727204076.29324: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 12755 1727204076.29364: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 12755 1727204076.29368: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 12755 1727204076.29374: stdout chunk (state=3): >>># destroy ipaddress <<< 12755 1727204076.29429: stdout chunk (state=3): >>># destroy ntpath <<< 12755 1727204076.29453: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 12755 1727204076.29460: stdout chunk (state=3): >>># destroy json.decoder # 
destroy json.encoder # destroy json.scanner <<< 12755 1727204076.29472: stdout chunk (state=3): >>># destroy _json <<< 12755 1727204076.29493: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale <<< 12755 1727204076.29507: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 12755 1727204076.29521: stdout chunk (state=3): >>># destroy syslog <<< 12755 1727204076.29529: stdout chunk (state=3): >>># destroy uuid <<< 12755 1727204076.29566: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 12755 1727204076.29588: stdout chunk (state=3): >>># destroy distro # destroy distro.distro <<< 12755 1727204076.29596: stdout chunk (state=3): >>># destroy argparse # destroy logging <<< 12755 1727204076.29649: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 12755 1727204076.29652: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle<<< 12755 1727204076.29670: stdout chunk (state=3): >>> <<< 12755 1727204076.29681: stdout chunk (state=3): >>># destroy _pickle # destroy queue <<< 12755 1727204076.29710: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.reduction <<< 12755 1727204076.29723: stdout chunk (state=3): >>># destroy selectors <<< 12755 1727204076.29751: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 12755 1727204076.29764: stdout chunk (state=3): >>># destroy base64 <<< 12755 1727204076.29775: stdout chunk (state=3): >>># destroy _ssl <<< 12755 1727204076.29810: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 12755 1727204076.29821: 
stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json <<< 12755 1727204076.29850: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 12755 1727204076.29879: stdout chunk (state=3): >>># destroy glob <<< 12755 1727204076.29883: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 12755 1727204076.29888: stdout chunk (state=3): >>># destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 12755 1727204076.29908: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 12755 1727204076.29949: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 12755 1727204076.29969: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 12755 1727204076.29985: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 12755 1727204076.29997: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 12755 1727204076.30016: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12755 1727204076.30036: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc <<< 12755 1727204076.30060: stdout chunk (state=3): >>># cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 12755 1727204076.30073: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 12755 1727204076.30099: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 12755 1727204076.30137: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator <<< 12755 1727204076.30171: stdout chunk (state=3): >>># cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 12755 1727204076.30181: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io<<< 12755 1727204076.30184: stdout chunk (state=3): >>> # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 12755 1727204076.30187: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # 
cleanup[3] wiping sys <<< 12755 1727204076.30221: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12755 1727204076.30390: stdout chunk (state=3): >>># destroy sys.monitoring <<< 12755 1727204076.30449: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 12755 1727204076.30483: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 12755 1727204076.30531: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 12755 1727204076.30554: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal<<< 12755 1727204076.30600: stdout chunk (state=3): >>> # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12755 1727204076.30717: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 12755 1727204076.30730: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 12755 1727204076.30765: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _operator # 
destroy _sha2 <<< 12755 1727204076.30817: stdout chunk (state=3): >>># destroy _sre # destroy _string # destroy re # destroy itertools <<< 12755 1727204076.30821: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 12755 1727204076.30841: stdout chunk (state=3): >>># clear sys.audit hooks <<< 12755 1727204076.31357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204076.31395: stderr chunk (state=3): >>><<< 12755 1727204076.31399: stdout chunk (state=3): >>><<< 12755 1727204076.31524: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519840c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51983dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519840ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # 
import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981bd0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981bdfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981fbe00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981fbec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198233800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5198233e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198213ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982111f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981f8fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198257710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198256330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982121e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981faea0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198288740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981f8230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5198288bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198288aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5198288e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51981f6d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198289550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198289220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519828a450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982a4680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51982a5dc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982a6cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51982a7320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982a6210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f51982a7da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51982a74d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519828a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197fa3d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197fcc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fcc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197fcc740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197fcc920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fa1eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fcdf70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fccbf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519828aba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197ffa2d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51980123f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519804f1d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198071970> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f519804f2f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198013080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197e982c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5198011430> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197fcee70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5198011550> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_vpumxyeq/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # 
code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197efe030> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197ed4f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197e9bfb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197ed7ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197f31af0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197f31880> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197f31190> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197f315e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197efecc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197f328a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197f32ae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197f32fc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d94e00> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197d96a20> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d973b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d98590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9b020> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197d9b170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d991f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9eed0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9d9a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9d700> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d9ff20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d997f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197de2f90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197de3140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197de8d10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197de8ad0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197deb2c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197de9400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197df2ae0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197deb470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df3dd0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df3bc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df3f20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197de3440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df6b10> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df7e90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197df52b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197df6660> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197df4e60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197c80080> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c810a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197dfb530> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c81220> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c83560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197c89c40> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197c8a5a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c82690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197c89400> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c8a840> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d22b10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c947d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c93c50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c926f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d29100> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197764140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197764440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5197c751f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197c747a0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d2b3b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d2aab0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197767380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197766c30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5197766e10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197766090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f51977674d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51977d2000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197767fe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197d2a840> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51977d33b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51977d2c60> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51977fe2a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51977eb140> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51970a1df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51970a1a60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f51970cb080> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51970cb2f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51970c9760> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197110ce0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197111c70> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5197164380> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f51971121e0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_dns": {"nameservers": 
["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_loadavg": {"1m": 0.52685546875, "5m": 0.36962890625, "15m": 0.1767578125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "35", "epoch": "1727204075", "epoch_int": "1727204075", "date": "2024-09-24", "time": "14:54:35", "iso8601_micro": "2024-09-24T18:54:35.899772Z", "iso8601": "2024-09-24T18:54:35Z", "iso8601_basic": "20240924T145435899772", "iso8601_basic_short": "20240924T145435", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d080:f60d:659:9515", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.210"], "ansible_all_ipv6_addresses": ["fe80::d080:f60d:659:9515"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.210", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d080:f60d:659:9515"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], 
"ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2833, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 884, "free": 2833}, "nocache": {"free": 3458, "used": 259}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_uuid": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": 
[], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 566, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251156553728, "block_size": 4096, "block_total": 64479564, "block_available": 61317518, "block_used": 3162046, "inode_total": 16384000, "inode_available": 16302250, "inode_used": 81750, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] 
removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing 
string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing 
multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy 
base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # 
cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy 
_frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
[WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy 
encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # 
cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # 
destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
12755 1727204076.32486: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204076.32493: _low_level_execute_command(): starting 12755 1727204076.32496: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204074.8385658-12789-180993002939808/ > /dev/null 2>&1 && sleep 0' 12755 1727204076.32639: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204076.32642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204076.32645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204076.32647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.32696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204076.32715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204076.32753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204076.34730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204076.34771: stderr chunk (state=3): >>><<< 12755 1727204076.34775: stdout chunk (state=3): >>><<< 12755 1727204076.34794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204076.34803: handler run complete 12755 1727204076.34924: variable 
'ansible_facts' from source: unknown 12755 1727204076.35004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204076.35267: variable 'ansible_facts' from source: unknown 12755 1727204076.35339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204076.36187: attempt loop complete, returning result 12755 1727204076.36191: _execute() done 12755 1727204076.36201: dumping result to json 12755 1727204076.36224: done dumping result, returning 12755 1727204076.36232: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [12b410aa-8751-72e9-1a19-0000000001bc] 12755 1727204076.36237: sending task result for task 12b410aa-8751-72e9-1a19-0000000001bc 12755 1727204076.36558: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001bc 12755 1727204076.36561: WORKER PROCESS EXITING ok: [managed-node1] 12755 1727204076.37051: no more pending results, returning what we have 12755 1727204076.37055: results queue empty 12755 1727204076.37056: checking for any_errors_fatal 12755 1727204076.37058: done checking for any_errors_fatal 12755 1727204076.37059: checking for max_fail_percentage 12755 1727204076.37061: done checking for max_fail_percentage 12755 1727204076.37062: checking to see if all hosts have failed and the running result is not ok 12755 1727204076.37064: done checking to see if all hosts have failed 12755 1727204076.37065: getting the remaining hosts for this loop 12755 1727204076.37067: done getting the remaining hosts for this loop 12755 1727204076.37071: getting the next task for host managed-node1 12755 1727204076.37083: done getting next task for host managed-node1 12755 1727204076.37088: ^ task is: TASK: meta (flush_handlers) 12755 1727204076.37121: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204076.37128: getting variables 12755 1727204076.37130: in VariableManager get_vars() 12755 1727204076.37158: Calling all_inventory to load vars for managed-node1 12755 1727204076.37162: Calling groups_inventory to load vars for managed-node1 12755 1727204076.37166: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204076.37178: Calling all_plugins_play to load vars for managed-node1 12755 1727204076.37182: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204076.37331: Calling groups_plugins_play to load vars for managed-node1 12755 1727204076.37674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204076.38070: done with get_vars() 12755 1727204076.38086: done getting variables 12755 1727204076.38188: in VariableManager get_vars() 12755 1727204076.38199: Calling all_inventory to load vars for managed-node1 12755 1727204076.38201: Calling groups_inventory to load vars for managed-node1 12755 1727204076.38202: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204076.38206: Calling all_plugins_play to load vars for managed-node1 12755 1727204076.38208: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204076.38210: Calling groups_plugins_play to load vars for managed-node1 12755 1727204076.38375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204076.38529: done with get_vars() 12755 1727204076.38540: done queuing things up, now waiting for results queue to drain 12755 1727204076.38542: results queue empty 12755 1727204076.38542: checking for any_errors_fatal 12755 1727204076.38544: done checking for any_errors_fatal 12755 1727204076.38545: checking for max_fail_percentage 12755 
1727204076.38545: done checking for max_fail_percentage 12755 1727204076.38546: checking to see if all hosts have failed and the running result is not ok 12755 1727204076.38549: done checking to see if all hosts have failed 12755 1727204076.38550: getting the remaining hosts for this loop 12755 1727204076.38551: done getting the remaining hosts for this loop 12755 1727204076.38553: getting the next task for host managed-node1 12755 1727204076.38557: done getting next task for host managed-node1 12755 1727204076.38558: ^ task is: TASK: Include the task 'el_repo_setup.yml' 12755 1727204076.38560: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204076.38561: getting variables 12755 1727204076.38562: in VariableManager get_vars() 12755 1727204076.38567: Calling all_inventory to load vars for managed-node1 12755 1727204076.38569: Calling groups_inventory to load vars for managed-node1 12755 1727204076.38571: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204076.38580: Calling all_plugins_play to load vars for managed-node1 12755 1727204076.38587: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204076.38595: Calling groups_plugins_play to load vars for managed-node1 12755 1727204076.38774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204076.39003: done with get_vars() 12755 1727204076.39010: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:11 Tuesday 24 September 2024 14:54:36 -0400 (0:00:01.609) 0:00:01.626 ***** 12755 
1727204076.39073: entering _queue_task() for managed-node1/include_tasks 12755 1727204076.39075: Creating lock for include_tasks 12755 1727204076.39343: worker is 1 (out of 1 available) 12755 1727204076.39357: exiting _queue_task() for managed-node1/include_tasks 12755 1727204076.39369: done queuing things up, now waiting for results queue to drain 12755 1727204076.39371: waiting for pending results... 12755 1727204076.39608: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 12755 1727204076.39663: in run() - task 12b410aa-8751-72e9-1a19-000000000006 12755 1727204076.39676: variable 'ansible_search_path' from source: unknown 12755 1727204076.39711: calling self._execute() 12755 1727204076.39779: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204076.39786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204076.39799: variable 'omit' from source: magic vars 12755 1727204076.39892: _execute() done 12755 1727204076.39896: dumping result to json 12755 1727204076.39899: done dumping result, returning 12755 1727204076.39907: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-72e9-1a19-000000000006] 12755 1727204076.39917: sending task result for task 12b410aa-8751-72e9-1a19-000000000006 12755 1727204076.40014: done sending task result for task 12b410aa-8751-72e9-1a19-000000000006 12755 1727204076.40021: WORKER PROCESS EXITING 12755 1727204076.40070: no more pending results, returning what we have 12755 1727204076.40075: in VariableManager get_vars() 12755 1727204076.40104: Calling all_inventory to load vars for managed-node1 12755 1727204076.40107: Calling groups_inventory to load vars for managed-node1 12755 1727204076.40111: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204076.40123: Calling all_plugins_play to load vars for managed-node1 12755 1727204076.40126: Calling 
groups_plugins_inventory to load vars for managed-node1 12755 1727204076.40130: Calling groups_plugins_play to load vars for managed-node1 12755 1727204076.40426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204076.40571: done with get_vars() 12755 1727204076.40577: variable 'ansible_search_path' from source: unknown 12755 1727204076.40587: we have included files to process 12755 1727204076.40587: generating all_blocks data 12755 1727204076.40588: done generating all_blocks data 12755 1727204076.40591: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12755 1727204076.40592: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12755 1727204076.40594: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12755 1727204076.41128: in VariableManager get_vars() 12755 1727204076.41140: done with get_vars() 12755 1727204076.41151: done processing included file 12755 1727204076.41152: iterating over new_blocks loaded from include file 12755 1727204076.41154: in VariableManager get_vars() 12755 1727204076.41163: done with get_vars() 12755 1727204076.41164: filtering new block on tags 12755 1727204076.41176: done filtering new block on tags 12755 1727204076.41178: in VariableManager get_vars() 12755 1727204076.41201: done with get_vars() 12755 1727204076.41202: filtering new block on tags 12755 1727204076.41217: done filtering new block on tags 12755 1727204076.41219: in VariableManager get_vars() 12755 1727204076.41227: done with get_vars() 12755 1727204076.41228: filtering new block on tags 12755 1727204076.41238: done filtering new block on tags 12755 1727204076.41239: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 12755 1727204076.41243: extending task lists for all hosts with included blocks 12755 1727204076.41285: done extending task lists 12755 1727204076.41286: done processing included files 12755 1727204076.41286: results queue empty 12755 1727204076.41287: checking for any_errors_fatal 12755 1727204076.41288: done checking for any_errors_fatal 12755 1727204076.41288: checking for max_fail_percentage 12755 1727204076.41290: done checking for max_fail_percentage 12755 1727204076.41291: checking to see if all hosts have failed and the running result is not ok 12755 1727204076.41292: done checking to see if all hosts have failed 12755 1727204076.41292: getting the remaining hosts for this loop 12755 1727204076.41293: done getting the remaining hosts for this loop 12755 1727204076.41295: getting the next task for host managed-node1 12755 1727204076.41298: done getting next task for host managed-node1 12755 1727204076.41299: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 12755 1727204076.41301: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204076.41302: getting variables 12755 1727204076.41303: in VariableManager get_vars() 12755 1727204076.41309: Calling all_inventory to load vars for managed-node1 12755 1727204076.41310: Calling groups_inventory to load vars for managed-node1 12755 1727204076.41312: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204076.41318: Calling all_plugins_play to load vars for managed-node1 12755 1727204076.41320: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204076.41322: Calling groups_plugins_play to load vars for managed-node1 12755 1727204076.41431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204076.41579: done with get_vars() 12755 1727204076.41587: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.025) 0:00:01.652 ***** 12755 1727204076.41643: entering _queue_task() for managed-node1/setup 12755 1727204076.41835: worker is 1 (out of 1 available) 12755 1727204076.41848: exiting _queue_task() for managed-node1/setup 12755 1727204076.41859: done queuing things up, now waiting for results queue to drain 12755 1727204076.41861: waiting for pending results... 
12755 1727204076.42001: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 12755 1727204076.42067: in run() - task 12b410aa-8751-72e9-1a19-0000000001cd 12755 1727204076.42077: variable 'ansible_search_path' from source: unknown 12755 1727204076.42080: variable 'ansible_search_path' from source: unknown 12755 1727204076.42140: calling self._execute() 12755 1727204076.42206: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204076.42213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204076.42226: variable 'omit' from source: magic vars 12755 1727204076.42593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204076.44458: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204076.44463: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204076.44466: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204076.44533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204076.44579: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204076.44652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204076.44755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204076.44759: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204076.44839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204076.44843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204076.45056: variable 'ansible_facts' from source: unknown 12755 1727204076.45116: variable 'network_test_required_facts' from source: task vars 12755 1727204076.45170: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 12755 1727204076.45256: variable 'omit' from source: magic vars 12755 1727204076.45259: variable 'omit' from source: magic vars 12755 1727204076.45278: variable 'omit' from source: magic vars 12755 1727204076.45303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204076.45334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204076.45366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204076.45384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204076.45395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204076.45513: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204076.45516: variable 'ansible_host' from source: host vars for 
'managed-node1' 12755 1727204076.45519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204076.45577: Set connection var ansible_connection to ssh 12755 1727204076.45590: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204076.45593: Set connection var ansible_shell_type to sh 12755 1727204076.45605: Set connection var ansible_timeout to 10 12755 1727204076.45612: Set connection var ansible_shell_executable to /bin/sh 12755 1727204076.45621: Set connection var ansible_pipelining to False 12755 1727204076.45666: variable 'ansible_shell_executable' from source: unknown 12755 1727204076.45669: variable 'ansible_connection' from source: unknown 12755 1727204076.45672: variable 'ansible_module_compression' from source: unknown 12755 1727204076.45684: variable 'ansible_shell_type' from source: unknown 12755 1727204076.45704: variable 'ansible_shell_executable' from source: unknown 12755 1727204076.45711: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204076.45732: variable 'ansible_pipelining' from source: unknown 12755 1727204076.45735: variable 'ansible_timeout' from source: unknown 12755 1727204076.45738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204076.45907: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204076.45922: variable 'omit' from source: magic vars 12755 1727204076.45936: starting attempt loop 12755 1727204076.45940: running the handler 12755 1727204076.45970: _low_level_execute_command(): starting 12755 1727204076.45973: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204076.46669: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204076.46678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.46681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204076.46684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.46784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204076.46844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204076.48646: stdout chunk (state=3): >>>/root <<< 12755 1727204076.48751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204076.48811: stderr chunk (state=3): >>><<< 12755 1727204076.48814: stdout chunk (state=3): >>><<< 12755 1727204076.48831: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204076.48852: _low_level_execute_command(): starting 12755 1727204076.48858: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940 `" && echo ansible-tmp-1727204076.488405-12859-83052676305940="` echo /root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940 `" ) && sleep 0' 12755 1727204076.49465: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204076.49468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204076.49470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.49474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204076.49476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204076.49479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.49532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204076.49536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204076.49590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204076.51634: stdout chunk (state=3): >>>ansible-tmp-1727204076.488405-12859-83052676305940=/root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940 <<< 12755 1727204076.51750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204076.51802: stderr chunk (state=3): >>><<< 12755 1727204076.51806: stdout chunk (state=3): >>><<< 12755 1727204076.51824: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204076.488405-12859-83052676305940=/root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204076.51864: variable 'ansible_module_compression' from source: unknown 12755 1727204076.51909: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12755 1727204076.51955: variable 'ansible_facts' from source: unknown 12755 1727204076.52075: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/AnsiballZ_setup.py 12755 1727204076.52196: Sending initial data 12755 1727204076.52199: Sent initial data (152 bytes) 12755 1727204076.52721: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.52725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204076.52729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.52813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204076.52869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204076.54547: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 12755 1727204076.54554: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204076.54590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204076.54642: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmparvju_5b /root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/AnsiballZ_setup.py <<< 12755 1727204076.54644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/AnsiballZ_setup.py" <<< 12755 1727204076.54679: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmparvju_5b" to remote "/root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/AnsiballZ_setup.py" <<< 12755 1727204076.56359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204076.56469: stderr chunk (state=3): >>><<< 12755 1727204076.56473: stdout chunk (state=3): >>><<< 12755 1727204076.56514: done transferring module to remote 12755 1727204076.56525: _low_level_execute_command(): starting 12755 1727204076.56530: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/ /root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/AnsiballZ_setup.py && sleep 0' 12755 1727204076.57100: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204076.57108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204076.57154: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.57197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204076.57204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204076.57253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204076.59174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204076.59222: stderr chunk (state=3): >>><<< 12755 1727204076.59225: stdout chunk (state=3): >>><<< 12755 1727204076.59240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204076.59243: _low_level_execute_command(): starting 12755 1727204076.59249: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/AnsiballZ_setup.py && sleep 0' 12755 1727204076.59848: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204076.59851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.59855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204076.59858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204076.59934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204076.59940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204076.59991: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12755 1727204076.62264: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 12755 1727204076.62301: stdout chunk (state=3): >>>import _imp # builtin <<< 12755 1727204076.62330: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12755 1727204076.62414: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 12755 1727204076.62462: stdout chunk (state=3): >>>import 'posix' # <<< 12755 1727204076.62494: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12755 1727204076.62526: stdout chunk (state=3): >>>import 'time' # <<< 12755 1727204076.62540: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 12755 1727204076.62598: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 12755 1727204076.62610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204076.62638: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 12755 1727204076.62709: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 12755 1727204076.62724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a900c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8fdbad0> <<< 12755 1727204076.62754: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 12755 1727204076.62782: 
stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a900ea20> <<< 12755 1727204076.62800: stdout chunk (state=3): >>>import '_signal' # <<< 12755 1727204076.62824: stdout chunk (state=3): >>>import '_abc' # <<< 12755 1727204076.62848: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 12755 1727204076.62901: stdout chunk (state=3): >>>import '_stat' # <<< 12755 1727204076.62978: stdout chunk (state=3): >>>import 'stat' # import '_collections_abc' # <<< 12755 1727204076.63020: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 12755 1727204076.63049: stdout chunk (state=3): >>>import 'os' # <<< 12755 1727204076.63069: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 12755 1727204076.63106: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 12755 1727204076.63130: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 12755 1727204076.63159: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 12755 1727204076.63185: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e210a0> <<< 12755 1727204076.63263: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 12755 1727204076.63266: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204076.63303: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e21fd0> <<< 12755 1727204076.63306: stdout chunk (state=3): >>>import 'site' # <<< 12755 1727204076.63335: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12755 1727204076.63783: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12755 1727204076.63896: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 12755 1727204076.63942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12755 1727204076.64022: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 12755 1727204076.64076: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py <<< 12755 1727204076.64151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12755 1727204076.64198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e978c0> <<< 12755 1727204076.64245: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e97f50> import '_collections' # <<< 12755 1727204076.64282: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e77b60> <<< 12755 1727204076.64297: stdout chunk (state=3): >>>import '_functools' # <<< 12755 1727204076.64328: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e752b0> <<< 12755 1727204076.64428: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5d070> <<< 12755 1727204076.64450: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 12755 1727204076.64486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 12755 1727204076.64517: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 12755 1727204076.64537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 12755 1727204076.64617: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 12755 1727204076.64673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8ebb890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eba4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 12755 1727204076.64711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e762a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eb8bc0> <<< 12755 1727204076.64751: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eec800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5c2f0> <<< 12755 1727204076.64774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12755 1727204076.64809: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # 
extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8eeccb0> <<< 12755 1727204076.64841: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eecb60> <<< 12755 1727204076.64981: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8eecf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5ae10> <<< 12755 1727204076.65126: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eed610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eed2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 12755 1727204076.65134: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eee510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 
12755 1727204076.65301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8f08740> import 'errno' # <<< 12755 1727204076.65357: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8f09e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 12755 1727204076.65408: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8f0ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8f0b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8f0a2d0> <<< 12755 1727204076.65485: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204076.65490: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8f0be30> <<< 12755 1727204076.65506: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8f0b560> <<< 12755 1727204076.65553: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eee570> <<< 12755 1727204076.65588: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12755 1727204076.65617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12755 1727204076.65674: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12755 1727204076.65768: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8c4bd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 12755 1727204076.65856: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8c74860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c745c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8c74890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8c74a70> <<< 12755 1727204076.65902: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c49ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12755 1727204076.66071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c76180> <<< 12755 1727204076.66096: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c74e00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eeec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12755 1727204076.66148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 12755 
1727204076.66165: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12755 1727204076.66231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 12755 1727204076.66269: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c9e510> <<< 12755 1727204076.66335: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204076.66356: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 12755 1727204076.66379: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12755 1727204076.66457: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cba690> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12755 1727204076.66546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12755 1727204076.66554: stdout chunk (state=3): >>>import 'ntpath' # <<< 12755 1727204076.66604: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cef410> <<< 12755 1727204076.66607: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12755 
1727204076.66640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12755 1727204076.66664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12755 1727204076.66713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12755 1727204076.66809: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8d19bb0> <<< 12755 1727204076.66894: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cef530> <<< 12755 1727204076.66934: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cbb320> <<< 12755 1727204076.66977: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8af84a0> <<< 12755 1727204076.67001: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cb96d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c770b0> <<< 12755 1727204076.67292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f29a8cb97f0> <<< 12755 1727204076.67406: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_dvu4qrdz/ansible_setup_payload.zip' # zipimport: zlib available <<< 12755 1727204076.67717: stdout chunk (state=3): >>># zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12755 1727204076.67746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b660c0> <<< 12755 1727204076.67768: stdout chunk (state=3): >>>import '_typing' # <<< 12755 1727204076.67958: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b3d040> <<< 12755 1727204076.67979: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b3c1a0> # zipimport: zlib available <<< 12755 1727204076.68008: stdout chunk (state=3): >>>import 'ansible' # <<< 12755 1727204076.68053: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.68076: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 12755 1727204076.68172: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.69685: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.71021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b3ffb0> <<< 12755 1727204076.71053: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204076.71083: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 12755 1727204076.71104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12755 1727204076.71123: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12755 1727204076.71157: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8b95be0> <<< 12755 1727204076.71209: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b95970> <<< 12755 1727204076.71239: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b95280> <<< 12755 1727204076.71273: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 12755 1727204076.71277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12755 1727204076.71323: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b959d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b66b70> <<< 12755 1727204076.71327: 
stdout chunk (state=3): >>>import 'atexit' # <<< 12755 1727204076.71361: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8b96960> <<< 12755 1727204076.71401: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8b96ba0> <<< 12755 1727204076.71421: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12755 1727204076.71477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12755 1727204076.71480: stdout chunk (state=3): >>>import '_locale' # <<< 12755 1727204076.71562: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b97020> <<< 12755 1727204076.71566: stdout chunk (state=3): >>>import 'pwd' # <<< 12755 1727204076.71568: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12755 1727204076.71592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12755 1727204076.71634: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a89fce30> <<< 12755 1727204076.71664: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' 
executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a89fea50> <<< 12755 1727204076.71687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 12755 1727204076.71712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12755 1727204076.71746: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a89ff410> <<< 12755 1727204076.71808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12755 1727204076.71849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a005f0> <<< 12755 1727204076.71853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12755 1727204076.71916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12755 1727204076.72012: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a030e0> <<< 12755 1727204076.72047: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a03230> <<< 12755 1727204076.72050: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a013a0> <<< 12755 1727204076.72070: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12755 1727204076.72110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12755 1727204076.72294: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12755 1727204076.72330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a070b0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a05b80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a058e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 12755 1727204076.72353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12755 1727204076.72421: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a07c20> <<< 12755 1727204076.72451: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a018b0> <<< 
12755 1727204076.72480: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a4b1d0> <<< 12755 1727204076.72518: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a4b320> <<< 12755 1727204076.72548: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 12755 1727204076.72656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 12755 1727204076.72669: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a50ef0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a50cb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12755 1727204076.72787: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12755 1727204076.72875: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a53440> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a515e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12755 1727204076.72920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204076.72955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 12755 1727204076.72997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 12755 1727204076.73027: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a5ac30> <<< 12755 1727204076.73187: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a535c0> <<< 12755 1727204076.73328: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5bef0> <<< 12755 1727204076.73331: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5ba70> <<< 12755 1727204076.73435: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5bd40> <<< 12755 1727204076.73459: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a4b620> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12755 1727204076.73501: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204076.73538: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204076.73557: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5ebd0> <<< 12755 1727204076.73778: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204076.73782: stdout chunk (state=3): >>># extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5ffe0> <<< 12755 1727204076.73874: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a5d370> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5e720> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a5cf20> <<< 12755 1727204076.73898: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 12755 1727204076.73980: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.74112: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.74331: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 12755 1727204076.74335: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 12755 1727204076.74337: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.74462: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.75177: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.75888: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 12755 1727204076.75915: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 12755 
1727204076.75969: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204076.76064: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204076.76100: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a88e8260> <<< 12755 1727204076.76136: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 12755 1727204076.76164: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88e90a0> <<< 12755 1727204076.76305: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a635f0> <<< 12755 1727204076.76340: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 12755 1727204076.76485: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.76672: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 12755 1727204076.76701: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88e8e60> # zipimport: zlib available <<< 12755 1727204076.77289: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12755 1727204076.78149: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 12755 1727204076.78208: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.78346: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12755 1727204076.78353: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.78670: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.78674: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12755 1727204076.78676: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.78798: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.79077: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12755 1727204076.79116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12755 1727204076.79138: stdout chunk (state=3): >>>import '_ast' # <<< 12755 1727204076.79240: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88eb6b0> <<< 12755 1727204076.79243: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.79318: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.79421: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 12755 1727204076.79450: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' 
# <<< 12755 1727204076.79466: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12755 1727204076.79553: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204076.79674: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a88f1b80> <<< 12755 1727204076.79732: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a88f24e0> <<< 12755 1727204076.79754: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5ffb0> # zipimport: zlib available <<< 12755 1727204076.79803: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.79840: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12755 1727204076.79861: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.79904: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.79945: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.80011: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.80082: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12755 1727204076.80127: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204076.80240: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a88f13d0> <<< 12755 1727204076.80273: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88f26c0> <<< 12755 1727204076.80456: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 12755 1727204076.80459: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.80484: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.80532: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 12755 1727204076.80781: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from 
'/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8986900> <<< 12755 1727204076.80811: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88fc530> <<< 12755 1727204076.80913: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88fa6c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88fa510> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 12755 1727204076.80935: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.80965: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12755 1727204076.81109: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 12755 1727204076.81146: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.81209: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.81237: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.81259: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.81662: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 12755 1727204076.81858: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.82057: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.82105: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 12755 1727204076.82185: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 12755 1727204076.82215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 12755 1727204076.82331: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a89890d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 12755 1727204076.82403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 12755 1727204076.82715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7eb3f50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7eb8320> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a89610a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8960380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a898b380> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a898acc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 12755 1727204076.82719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 12755 1727204076.82722: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 12755 1727204076.82724: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 12755 1727204076.82793: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7ebb2c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7ebab70> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7ebad50> <<< 12755 1727204076.82829: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7eb9fa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12755 1727204076.83142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7ebb3e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7f25f10> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7ebbef0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a898ae70> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 12755 1727204076.83195: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.83211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 12755 1727204076.83323: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 12755 
1727204076.83379: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.83448: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 12755 1727204076.83477: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 12755 1727204076.83507: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.83525: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.83557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 12755 1727204076.83747: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.83754: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 12755 1727204076.83757: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.83895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 12755 1727204076.83899: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.83901: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.83995: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.84029: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 12755 1727204076.84062: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.84635: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.85105: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 12755 1727204076.85120: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.85166: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.85223: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 12755 1727204076.85262: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.85306: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 12755 1727204076.85435: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 12755 1727204076.85490: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.85519: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 12755 1727204076.85548: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.85584: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 12755 1727204076.85722: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.85741: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 12755 1727204076.85758: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.85849: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 12755 1727204076.85897: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7f27ad0> <<< 12755 1727204076.85946: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 12755 1727204076.85965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 12755 1727204076.86064: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f29a7f26c00> <<< 12755 1727204076.86125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 12755 1727204076.86162: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.86255: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 12755 1727204076.86352: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.86441: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 12755 1727204076.86474: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.86526: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.86613: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 12755 1727204076.86677: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.86803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 12755 1727204076.86851: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204076.86929: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7f56240> <<< 12755 1727204076.87148: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7f3eed0> import 'ansible.module_utils.facts.system.python' # <<< 12755 1727204076.87162: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.87543: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' 
# # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.87547: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.87609: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.88079: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7d6deb0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7f3fa10> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 12755 1727204076.88112: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 12755 1727204076.88233: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 12755 1727204076.88375: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.88560: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 12755 1727204076.88677: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.88783: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.88834: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12755 1727204076.88890: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 12755 1727204076.88917: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.88975: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.88991: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.89116: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.89280: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 12755 1727204076.89298: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.89437: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.89569: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 12755 1727204076.89592: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.89613: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.89654: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.90402: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.90951: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 12755 1727204076.91056: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 12755 1727204076.91087: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.91213: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 12755 1727204076.91231: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.91331: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.91459: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 
12755 1727204076.91463: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.91630: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.91842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 12755 1727204076.91846: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 12755 1727204076.91866: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.91902: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.91961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 12755 1727204076.92152: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.92179: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.92431: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.92649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 12755 1727204076.92708: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.92748: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # <<< 12755 1727204076.92846: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 12755 1727204076.93020: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 12755 1727204076.93057: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 12755 1727204076.93061: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.93130: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12755 1727204076.93238: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 12755 1727204076.93241: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.93257: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.93362: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 12755 1727204076.93452: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.93688: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.93945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 12755 1727204076.94023: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.94141: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 12755 1727204076.94144: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.94173: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 12755 1727204076.94194: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.94221: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.94274: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 12755 1727204076.94278: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.94359: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 12755 1727204076.94362: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.94468: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.94676: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 12755 1727204076.94700: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 12755 1727204076.94754: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204076.94796: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.94847: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.94925: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.95001: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 12755 1727204076.95033: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 12755 1727204076.95083: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.95140: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 12755 1727204076.95163: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.95378: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.95615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 12755 1727204076.95669: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.95720: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 12755 1727204076.95748: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.95790: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.95848: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 12755 1727204076.95864: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.95952: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.96039: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 12755 1727204076.96057: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.96198: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.96259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 12755 1727204076.96364: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204076.96573: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc'<<< 12755 1727204076.96580: stdout chunk (state=3): >>> <<< 12755 1727204076.96642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 12755 1727204076.96658: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7d96ed0> <<< 12755 1727204076.96672: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7d95cd0> <<< 12755 1727204076.96739: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7d94c50> <<< 12755 1727204076.97865: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, 
"search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "36", "epoch": "1727204076", "epoch_int": "1727204076", "date": "2024-09-24", "time": "14:54:36", "iso8601_micro": "2024-09-24T18:54:36.969450Z", "iso8601": "2024-09-24T18:54:36Z", "iso8601_basic": "20240924T145436969450", "iso8601_basic_short": "20240924T145436", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": 
"/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12755 1727204076.98596: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # 
clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # 
cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset <<< 12755 1727204076.98608: stdout chunk (state=3): >>># destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json <<< 12755 1727204076.98626: stdout chunk (state=3): >>># cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] 
removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid <<< 12755 1727204076.98643: stdout chunk (state=3): >>># cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common <<< 12755 1727204076.98647: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing 
ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 12755 1727204076.98712: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation <<< 12755 1727204076.98718: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 12755 1727204076.98721: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # 
cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle <<< 12755 1727204076.98758: stdout chunk (state=3): >>># cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # 
destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version <<< 12755 1727204076.98780: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos <<< 12755 1727204076.98805: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat 
# cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor <<< 12755 1727204076.98835: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy 
ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 12755 1727204076.99339: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12755 1727204076.99343: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 12755 1727204076.99345: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 12755 1727204076.99385: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # 
destroy lzma # destroy zipfile._path <<< 12755 1727204076.99393: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 12755 1727204076.99397: stdout chunk (state=3): >>># destroy ntpath <<< 12755 1727204076.99465: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 12755 1727204076.99468: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 12755 1727204076.99494: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 12755 1727204076.99541: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 12755 1727204076.99611: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 12755 1727204076.99672: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 12755 1727204076.99675: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 12755 1727204076.99726: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl <<< 12755 1727204076.99730: stdout chunk (state=3): >>># destroy datetime # destroy subprocess <<< 12755 1727204076.99795: stdout chunk (state=3): >>># destroy base64 # destroy _ssl 
<<< 12755 1727204076.99798: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 12755 1727204076.99823: stdout chunk (state=3): >>># destroy termios # destroy errno # destroy json <<< 12755 1727204076.99853: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 12755 1727204076.99964: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 12755 1727204077.00050: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12755 1727204077.00214: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] 
wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 12755 1727204077.00295: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12755 1727204077.00564: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves <<< 12755 1727204077.00568: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 12755 1727204077.00599: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12755 1727204077.00700: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 12755 1727204077.00760: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 12755 1727204077.00764: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 12755 1727204077.00810: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools <<< 12755 1727204077.00834: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 12755 1727204077.00859: stdout chunk (state=3): >>># clear sys.audit hooks <<< 12755 1727204077.01331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204077.01380: stderr chunk (state=3): >>><<< 12755 1727204077.01397: stdout chunk (state=3): >>><<< 12755 1727204077.01563: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a900c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8fdbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a900ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e210a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e21fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5ff80> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e978c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e97f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e77b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e752b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5d070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8ebb890> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eba4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e762a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eb8bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eec800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8eeccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eecb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8eecf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eed610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eed2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eee510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8f08740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8f09e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f29a8f0ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8f0b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8f0a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8f0be30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8f0b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eee570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8c4bd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8c74860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c745c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8c74890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8c74a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c49ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c76180> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c74e00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8eeec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c9e510> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cba690> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cef410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8d19bb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cef530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cbb320> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8af84a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8cb96d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8c770b0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f29a8cb97f0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_dvu4qrdz/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b660c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b3d040> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b3c1a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b3ffb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8b95be0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b95970> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b95280> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b959d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b66b70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8b96960> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8b96ba0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8b97020> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a89fce30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a89fea50> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a89ff410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a005f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a030e0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a03230> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a013a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a070b0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a05b80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a058e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a07c20> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a018b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a4b1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a4b320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a50ef0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a50cb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a53440> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a515e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a5ac30> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a535c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5bef0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5ba70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5bd40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a4b620> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5ebd0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5ffe0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a5d370> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a8a5e720> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a5cf20> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a88e8260> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88e90a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8a635f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88e8e60> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88eb6b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a88f1b80> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a88f24e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8e5ffb0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a88f13d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88f26c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8986900> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88fc530> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f29a88fa6c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a88fa510> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a89890d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7eb3f50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7eb8320> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a89610a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a8960380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a898b380> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a898acc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7ebb2c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7ebab70> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7ebad50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7eb9fa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7ebb3e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7f25f10> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7ebbef0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a898ae70> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7f27ad0> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7f26c00> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7f56240> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7f3eed0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7d6deb0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7f3fa10> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f29a7d96ed0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7d95cd0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f29a7d94c50> {"ansible_facts": 
{"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "36", "epoch": "1727204076", "epoch_int": "1727204076", "date": "2024-09-24", "time": "14:54:36", "iso8601_micro": "2024-09-24T18:54:36.969450Z", "iso8601": "2024-09-24T18:54:36Z", "iso8601_basic": "20240924T145436969450", "iso8601_basic_short": "20240924T145436", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": 
"/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # 
restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # 
cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # 
cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing 
ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # 
cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy 
ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy 
multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache [duplicate Python interpreter shutdown dump omitted; identical to the stdout junk shown above] # clear sys.audit hooks 12755 1727204077.03463: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 
12755 1727204077.03467: _low_level_execute_command(): starting 12755 1727204077.03469: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204076.488405-12859-83052676305940/ > /dev/null 2>&1 && sleep 0' 12755 1727204077.03567: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204077.03581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204077.03599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204077.03624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204077.03687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204077.03753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204077.03770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204077.03825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204077.03878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204077.06021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204077.06025: stdout chunk (state=3): >>><<< 12755 
1727204077.06028: stderr chunk (state=3): >>><<< 12755 1727204077.06295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204077.06299: handler run complete 12755 1727204077.06301: variable 'ansible_facts' from source: unknown 12755 1727204077.06303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204077.06684: variable 'ansible_facts' from source: unknown 12755 1727204077.06775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204077.06866: attempt loop complete, returning result 12755 1727204077.06970: _execute() done 12755 1727204077.07294: dumping result to json 12755 1727204077.07298: done dumping result, returning 12755 1727204077.07301: done running TaskExecutor() for managed-node1/TASK: 
Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-72e9-1a19-0000000001cd] 12755 1727204077.07303: sending task result for task 12b410aa-8751-72e9-1a19-0000000001cd 12755 1727204077.07451: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001cd 12755 1727204077.07455: WORKER PROCESS EXITING ok: [managed-node1] 12755 1727204077.07738: no more pending results, returning what we have 12755 1727204077.07786: results queue empty 12755 1727204077.07788: checking for any_errors_fatal 12755 1727204077.07792: done checking for any_errors_fatal 12755 1727204077.07793: checking for max_fail_percentage 12755 1727204077.07795: done checking for max_fail_percentage 12755 1727204077.07796: checking to see if all hosts have failed and the running result is not ok 12755 1727204077.07797: done checking to see if all hosts have failed 12755 1727204077.07798: getting the remaining hosts for this loop 12755 1727204077.07799: done getting the remaining hosts for this loop 12755 1727204077.07804: getting the next task for host managed-node1 12755 1727204077.07814: done getting next task for host managed-node1 12755 1727204077.07816: ^ task is: TASK: Check if system is ostree 12755 1727204077.07819: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204077.07823: getting variables 12755 1727204077.07825: in VariableManager get_vars() 12755 1727204077.07855: Calling all_inventory to load vars for managed-node1 12755 1727204077.07858: Calling groups_inventory to load vars for managed-node1 12755 1727204077.07862: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204077.07881: Calling all_plugins_play to load vars for managed-node1 12755 1727204077.07884: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204077.07888: Calling groups_plugins_play to load vars for managed-node1 12755 1727204077.08169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204077.08491: done with get_vars() 12755 1727204077.08509: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.669) 0:00:02.322 ***** 12755 1727204077.08724: entering _queue_task() for managed-node1/stat 12755 1727204077.09120: worker is 1 (out of 1 available) 12755 1727204077.09135: exiting _queue_task() for managed-node1/stat 12755 1727204077.09149: done queuing things up, now waiting for results queue to drain 12755 1727204077.09150: waiting for pending results... 
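The task being queued above, "Check if system is ostree" (task path `el_repo_setup.yml:17`), dispatches the `stat` module to the managed node. The check it performs can be condensed into a few lines: a minimal, hypothetical Python sketch, assuming the conventional `/run/ostree-booted` marker path (the marker path and the `is_ostree` helper name are assumptions for illustration, not taken from the log):

```python
import os

def is_ostree(marker: str = "/run/ostree-booted") -> bool:
    """Return True when the ostree boot marker exists.

    rpm-ostree based systems (e.g. Fedora CoreOS, RHEL for Edge) create
    this marker at boot; a stat-style existence test on it is the usual
    way a role decides whether it is running on an ostree system.
    """
    return os.path.exists(marker)
```

The real task does the equivalent through `ansible.builtin.stat` and registers the result for later conditionals; this sketch only condenses the existence check itself.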
12755 1727204077.09426: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 12755 1727204077.09595: in run() - task 12b410aa-8751-72e9-1a19-0000000001cf 12755 1727204077.09599: variable 'ansible_search_path' from source: unknown 12755 1727204077.09602: variable 'ansible_search_path' from source: unknown 12755 1727204077.09605: calling self._execute() 12755 1727204077.09686: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204077.09702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204077.09717: variable 'omit' from source: magic vars 12755 1727204077.10553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204077.10985: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204077.11062: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204077.11105: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204077.11170: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204077.11263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204077.11314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204077.11388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204077.11398: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204077.11555: Evaluated conditional (not __network_is_ostree is defined): True 12755 1727204077.11568: variable 'omit' from source: magic vars 12755 1727204077.11639: variable 'omit' from source: magic vars 12755 1727204077.11691: variable 'omit' from source: magic vars 12755 1727204077.11761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204077.11951: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204077.11957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204077.11960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204077.11963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204077.11974: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204077.11985: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204077.11998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204077.12208: Set connection var ansible_connection to ssh 12755 1727204077.12223: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204077.12232: Set connection var ansible_shell_type to sh 12755 1727204077.12252: Set connection var ansible_timeout to 10 12755 1727204077.12279: Set connection var ansible_shell_executable to /bin/sh 12755 1727204077.12283: Set connection var ansible_pipelining to False 12755 1727204077.12388: variable 'ansible_shell_executable' from source: unknown 12755 1727204077.12394: variable 'ansible_connection' from 
source: unknown 12755 1727204077.12397: variable 'ansible_module_compression' from source: unknown 12755 1727204077.12400: variable 'ansible_shell_type' from source: unknown 12755 1727204077.12404: variable 'ansible_shell_executable' from source: unknown 12755 1727204077.12406: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204077.12408: variable 'ansible_pipelining' from source: unknown 12755 1727204077.12410: variable 'ansible_timeout' from source: unknown 12755 1727204077.12413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204077.12609: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204077.12617: variable 'omit' from source: magic vars 12755 1727204077.12620: starting attempt loop 12755 1727204077.12623: running the handler 12755 1727204077.12625: _low_level_execute_command(): starting 12755 1727204077.12628: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204077.13424: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204077.13442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204077.13502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204077.13604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204077.13608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204077.13624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204077.13645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204077.13667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204077.13735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204077.15494: stdout chunk (state=3): >>>/root <<< 12755 1727204077.15693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204077.15698: stdout chunk (state=3): >>><<< 12755 1727204077.15700: stderr chunk (state=3): >>><<< 12755 1727204077.15725: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204077.15842: _low_level_execute_command(): starting 12755 1727204077.15846: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158 `" && echo ansible-tmp-1727204077.1573813-12891-30496680506158="` echo /root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158 `" ) && sleep 0' 12755 1727204077.16436: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204077.16524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204077.16548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204077.16578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 
1727204077.16653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204077.18720: stdout chunk (state=3): >>>ansible-tmp-1727204077.1573813-12891-30496680506158=/root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158 <<< 12755 1727204077.18932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204077.18935: stdout chunk (state=3): >>><<< 12755 1727204077.18938: stderr chunk (state=3): >>><<< 12755 1727204077.19095: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204077.1573813-12891-30496680506158=/root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204077.19098: variable 'ansible_module_compression' from source: unknown 12755 1727204077.19101: ANSIBALLZ: Using lock for stat 12755 1727204077.19103: 
ANSIBALLZ: Acquiring lock 12755 1727204077.19105: ANSIBALLZ: Lock acquired: 139630693733424 12755 1727204077.19107: ANSIBALLZ: Creating module 12755 1727204077.38274: ANSIBALLZ: Writing module into payload 12755 1727204077.38403: ANSIBALLZ: Writing module 12755 1727204077.38431: ANSIBALLZ: Renaming module 12755 1727204077.38443: ANSIBALLZ: Done creating module 12755 1727204077.38465: variable 'ansible_facts' from source: unknown 12755 1727204077.38572: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/AnsiballZ_stat.py 12755 1727204077.39017: Sending initial data 12755 1727204077.39033: Sent initial data (152 bytes) 12755 1727204077.40535: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204077.40746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204077.40828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204077.42616: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12755 1727204077.42634: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204077.42657: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12755 1727204077.42714: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpdgc9jc52 /root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/AnsiballZ_stat.py <<< 12755 1727204077.42809: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/AnsiballZ_stat.py" <<< 12755 1727204077.43111: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpdgc9jc52" to remote "/root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/AnsiballZ_stat.py" <<< 12755 1727204077.45494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204077.45502: stderr chunk (state=3): >>><<< 12755 1727204077.45505: stdout chunk (state=3): >>><<< 12755 1727204077.45507: done transferring module to remote 12755 
1727204077.45510: _low_level_execute_command(): starting 12755 1727204077.45512: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/ /root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/AnsiballZ_stat.py && sleep 0' 12755 1727204077.46784: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204077.46948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204077.46962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204077.47162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204077.47187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204077.49233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204077.49317: stderr chunk (state=3): >>><<< 12755 1727204077.49321: stdout chunk (state=3): >>><<< 12755 1727204077.49550: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204077.49554: _low_level_execute_command(): starting 12755 1727204077.49557: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/AnsiballZ_stat.py && sleep 0' 12755 1727204077.50468: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204077.50482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204077.50610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204077.50624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204077.50649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204077.50699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204077.50837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204077.50871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204077.50959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204077.53240: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 12755 1727204077.53288: stdout chunk (state=3): >>>import _imp # builtin <<< 12755 1727204077.53424: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 12755 1727204077.53428: stdout chunk (state=3): >>>import 'posix' # <<< 12755 1727204077.53481: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 12755 1727204077.53484: stdout chunk (state=3): >>># installing zipimport hook import 'time' # <<< 12755 1727204077.53507: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 12755 1727204077.53543: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 12755 1727204077.53642: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 12755 1727204077.53646: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 12755 1727204077.53675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fd2c4d0> <<< 12755 1727204077.53681: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fcfbad0> <<< 12755 1727204077.53826: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 12755 1727204077.53846: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fd2ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 12755 1727204077.53912: stdout chunk (state=3): >>>import '_collections_abc' # <<< 12755 1727204077.54076: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 12755 1727204077.54081: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 12755 1727204077.54088: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 12755 1727204077.54105: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb410a0> <<< 12755 1727204077.54181: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 12755 1727204077.54184: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204077.54258: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb41fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12755 1727204077.54627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12755 1727204077.54631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 12755 1727204077.54636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 12755 1727204077.54650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12755 1727204077.54666: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f230fb7fec0> <<< 12755 1727204077.54711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 12755 1727204077.54818: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb7ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12755 1727204077.54938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fbb78c0> <<< 12755 1727204077.54947: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 12755 1727204077.54950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fbb7f50> <<< 12755 1727204077.54963: stdout chunk (state=3): >>>import '_collections' # <<< 12755 1727204077.55027: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb97b60> <<< 12755 1727204077.55031: stdout chunk (state=3): >>>import '_functools' # <<< 12755 1727204077.55136: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f230fb952b0> <<< 12755 1727204077.55152: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb7d070> <<< 12755 1727204077.55175: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 12755 1727204077.55274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 12755 1727204077.55279: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 12755 1727204077.55367: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fbdb890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fbda4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb962a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fbd8bc0> <<< 12755 1727204077.55419: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 12755 1727204077.55473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0c800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f230fb7c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12755 1727204077.55593: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204077.55598: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc0ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0cb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc0cf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb7ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204077.55614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 12755 1727204077.55643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 12755 1727204077.55720: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0d610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0d2e0> import 'importlib.machinery' # <<< 12755 1727204077.55931: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0e510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc28740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204077.56026: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc29e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc2ad80> <<< 12755 1727204077.56050: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc2b3e0> 
import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc2a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12755 1727204077.56070: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc2be30> <<< 12755 1727204077.56162: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc2b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0e570> <<< 12755 1727204077.56168: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 12755 1727204077.56183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12755 1727204077.56267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f9f3d40> <<< 12755 1727204077.56286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 12755 1727204077.56336: stdout chunk (state=3): >>># extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fa1c860> <<< 12755 1727204077.56342: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa1c5c0> <<< 12755 1727204077.56478: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fa1c890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fa1ca70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f9f1ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12755 1727204077.56546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 12755 1727204077.56598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 12755 1727204077.56602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 12755 1727204077.56604: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa1e180> <<< 12755 1727204077.56697: stdout chunk (state=3): 
>>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa1ce00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0ec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12755 1727204077.56719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204077.56822: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa46510> <<< 12755 1727204077.56863: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12755 1727204077.56880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204077.56929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12755 1727204077.56967: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa62690> <<< 12755 1727204077.57036: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12755 1727204077.57099: stdout chunk (state=3): >>>import 'ntpath' # <<< 12755 1727204077.57142: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object 
from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa97410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12755 1727204077.57179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12755 1727204077.57361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12755 1727204077.57364: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fac1bb0> <<< 12755 1727204077.57422: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa97530> <<< 12755 1727204077.57465: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa63320> <<< 12755 1727204077.57502: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 12755 1727204077.57519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f89c4a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa616d0> <<< 12755 1727204077.57530: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa1f0b0> <<< 12755 1727204077.57670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f230fa617f0> <<< 12755 
1727204077.57732: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_k2sj8y80/ansible_stat_payload.zip' <<< 12755 1727204077.57736: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.57997: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12755 1727204077.58012: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12755 1727204077.58046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12755 1727204077.58081: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 12755 1727204077.58206: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8f60c0> import '_typing' # <<< 12755 1727204077.58295: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8cd040> <<< 12755 1727204077.58327: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8cc1a0> # zipimport: zlib available <<< 12755 1727204077.58394: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 12755 1727204077.58398: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.58401: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.58412: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 12755 1727204077.60195: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 
1727204077.61340: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8cffe0> <<< 12755 1727204077.61360: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204077.61377: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12755 1727204077.61429: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12755 1727204077.61549: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f921b80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f921910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f921220> <<< 12755 1727204077.61553: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12755 1727204077.61641: stdout chunk (state=3): >>>import 'json.encoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f230f921c70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8f6b70> import 'atexit' # <<< 12755 1727204077.61673: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f922930> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f922b70> <<< 12755 1727204077.61753: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 12755 1727204077.61799: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f9230b0> import 'pwd' # <<< 12755 1727204077.61856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12755 1727204077.62001: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f784e90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f786ab0> # 
/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7873b0> <<< 12755 1727204077.62021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12755 1727204077.62094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f788590> <<< 12755 1727204077.62102: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12755 1727204077.62206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12755 1727204077.62282: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78b080> <<< 12755 1727204077.62286: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204077.62292: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f78b1d0> <<< 12755 1727204077.62389: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f789340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches 
/usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12755 1727204077.62398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 12755 1727204077.62447: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 12755 1727204077.62451: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78eff0> import '_tokenize' # <<< 12755 1727204077.62536: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78dac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78d820> <<< 12755 1727204077.62544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12755 1727204077.62678: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78ff50> <<< 12755 1727204077.62685: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f789850> <<< 12755 1727204077.62776: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 
'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7d7110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7d7290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 12755 1727204077.62830: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204077.62833: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7d8e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7d8c20> <<< 12755 1727204077.62883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12755 1727204077.63032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7db3b0> import 
'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7d9550> <<< 12755 1727204077.63035: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12755 1727204077.63126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 12755 1727204077.63161: stdout chunk (state=3): >>>import '_string' # <<< 12755 1727204077.63174: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7e2ba0> <<< 12755 1727204077.63336: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7db530> <<< 12755 1727204077.63407: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e3e60><<< 12755 1727204077.63713: stdout chunk (state=3): >>> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204077.63718: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e3a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension 
module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e3f20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7d7590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e7680> <<< 12755 1727204077.64008: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e8650> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7e5df0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e7170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7e59d0> 
# zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 12755 1727204077.64059: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.64162: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.64181: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 12755 1727204077.64201: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.64302: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 12755 1727204077.64367: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.64512: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.65204: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.65896: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 12755 1727204077.65912: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 12755 1727204077.65966: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204077.66022: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f870830> <<< 12755 1727204077.66130: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 12755 1727204077.66188: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f871580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78da30> <<< 12755 1727204077.66228: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12755 1727204077.66245: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.66297: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 12755 1727204077.66375: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.66470: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.66660: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 12755 1727204077.66733: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f871340> # zipimport: zlib available <<< 12755 1727204077.67281: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.67829: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.68029: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 12755 1727204077.68367: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 12755 1727204077.68461: stdout chunk (state=3): >>># zipimport: zlib available 
import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 12755 1727204077.68732: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.69021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12755 1727204077.69104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 12755 1727204077.69215: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f873f50> # zipimport: zlib available <<< 12755 1727204077.69314: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.69450: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 12755 1727204077.69454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12755 1727204077.69542: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12755 1727204077.69687: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f67df10> <<< 12755 1727204077.69772: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f67e840> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f873290> # zipimport: zlib available <<< 12755 1727204077.69808: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.69848: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12755 1727204077.69867: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.70030: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12755 1727204077.70034: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.70102: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12755 1727204077.70152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12755 1727204077.70477: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f67d6d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f67ea80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 12755 1727204077.70492: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.70520: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.70732: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12755 1727204077.70736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12755 1727204077.70751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12755 1727204077.70938: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f70ed50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f688b30> <<< 12755 1727204077.70957: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f686b70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f6869c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 12755 1727204077.70972: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.71002: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.71036: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12755 1727204077.71111: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 12755 1727204077.71282: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 12755 
1727204077.71298: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.71528: stdout chunk (state=3): >>># zipimport: zlib available <<< 12755 1727204077.71687: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 12755 1727204077.71809: stdout chunk (state=3): >>># destroy __main__ <<< 12755 1727204077.72046: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 12755 1727204077.72066: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 12755 1727204077.72342: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # 
cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy 
urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # 
cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # 
destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 12755 1727204077.72897: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 12755 1727204077.72905: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime <<< 12755 
1727204077.72910: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 12755 1727204077.72915: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 12755 1727204077.72922: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 12755 1727204077.72925: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 12755 1727204077.72929: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 12755 1727204077.73070: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # 
cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 12755 1727204077.73073: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 12755 1727204077.73076: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 12755 1727204077.73166: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external <<< 12755 1727204077.73191: stdout chunk (state=3): >>># cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12755 1727204077.73287: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 12755 1727204077.73376: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 12755 1727204077.73383: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 12755 1727204077.73674: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 12755 1727204077.73677: stdout chunk (state=3): >>># destroy itertools <<< 12755 1727204077.73679: stdout chunk (state=3): >>># destroy _abc <<< 12755 1727204077.73681: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 12755 1727204077.73733: stdout chunk (state=3): >>># clear sys.audit hooks <<< 12755 1727204077.74263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204077.74267: stdout chunk (state=3): >>><<< 12755 1727204077.74276: stderr chunk (state=3): >>><<< 12755 1727204077.74492: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fd2c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fcfbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fd2ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb410a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb41fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb7fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb7ff80> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fbb78c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fbb7f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb97b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb952b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb7d070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fbdb890> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f230fbda4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb962a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fbd8bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0c800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb7c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc0ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0cb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc0cf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fb7ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0d610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0d2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0e510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc28740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc29e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f230fc2ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc2b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc2a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fc2be30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc2b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0e570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f9f3d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fa1c860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa1c5c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fa1c890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230fa1ca70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f9f1ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa1e180> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa1ce00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fc0ec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa46510> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa62690> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa97410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fac1bb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa97530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa63320> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f89c4a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa616d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230fa1f0b0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f230fa617f0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_k2sj8y80/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8f60c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8cd040> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8cc1a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8cffe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f921b80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f921910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f921220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f921c70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f8f6b70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f922930> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f922b70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f9230b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f784e90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f786ab0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7873b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f788590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78b080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f78b1d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f789340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78eff0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78dac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78d820> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78ff50> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f230f789850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7d7110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7d7290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7d8e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7d8c20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7db3b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7d9550> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7e2ba0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7db530> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e3e60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e3a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e3f20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7d7590> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e7680> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e8650> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7e5df0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f7e7170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f7e59d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f870830> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f871580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f78da30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f871340> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f873f50> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f67df10> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f67e840> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f873290> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f230f67d6d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f67ea80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f70ed50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f688b30> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f230f686b70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f230f6869c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # 
destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # 
destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] 
removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes 
# destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # 
destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 12755 1727204077.75604: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': 
'/root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204077.75608: _low_level_execute_command(): starting 12755 1727204077.75611: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204077.1573813-12891-30496680506158/ > /dev/null 2>&1 && sleep 0' 12755 1727204077.76034: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204077.76038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204077.76041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204077.76043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204077.76167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204077.76218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204077.76318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204077.76325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204077.76395: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12755 1727204077.78445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204077.78541: stderr chunk (state=3): >>><<< 12755 1727204077.78545: stdout chunk (state=3): >>><<< 12755 1727204077.78548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204077.78550: handler run complete 12755 1727204077.78576: attempt loop complete, returning result 12755 1727204077.78579: _execute() done 12755 1727204077.78582: dumping result to json 12755 1727204077.78588: done dumping result, returning 12755 1727204077.78758: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [12b410aa-8751-72e9-1a19-0000000001cf] 12755 1727204077.78762: sending task result for task 12b410aa-8751-72e9-1a19-0000000001cf 12755 1727204077.78837: done sending 
task result for task 12b410aa-8751-72e9-1a19-0000000001cf
12755 1727204077.78840: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
12755 1727204077.78934: no more pending results, returning what we have
12755 1727204077.78938: results queue empty
12755 1727204077.78939: checking for any_errors_fatal
12755 1727204077.78949: done checking for any_errors_fatal
12755 1727204077.78950: checking for max_fail_percentage
12755 1727204077.78951: done checking for max_fail_percentage
12755 1727204077.78952: checking to see if all hosts have failed and the running result is not ok
12755 1727204077.78953: done checking to see if all hosts have failed
12755 1727204077.78954: getting the remaining hosts for this loop
12755 1727204077.78955: done getting the remaining hosts for this loop
12755 1727204077.78960: getting the next task for host managed-node1
12755 1727204077.78967: done getting next task for host managed-node1
12755 1727204077.78970: ^ task is: TASK: Set flag to indicate system is ostree
12755 1727204077.78973: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204077.78976: getting variables
12755 1727204077.78979: in VariableManager get_vars()
12755 1727204077.79013: Calling all_inventory to load vars for managed-node1
12755 1727204077.79020: Calling groups_inventory to load vars for managed-node1
12755 1727204077.79024: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204077.79037: Calling all_plugins_play to load vars for managed-node1
12755 1727204077.79041: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204077.79045: Calling groups_plugins_play to load vars for managed-node1
12755 1727204077.79365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204077.80092: done with get_vars()
12755 1727204077.80106: done getting variables
12755 1727204077.80249: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22
Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.716)       0:00:03.038 *****
12755 1727204077.80280: entering _queue_task() for managed-node1/set_fact
12755 1727204077.80283: Creating lock for set_fact
12755 1727204077.80672: worker is 1 (out of 1 available)
12755 1727204077.80686: exiting _queue_task() for managed-node1/set_fact
12755 1727204077.80773: done queuing things up, now waiting for results queue to drain
12755 1727204077.80775: waiting for pending results...
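The "Check if system is ostree" task above stats `/run/ostree-booted`, and the follow-up "Set flag" task turns the stat result's `exists` flag into a boolean fact. A minimal Python sketch of that detection logic (the marker path and fact name come from the log; the function itself is hypothetical, not Ansible code):

```python
import os

def detect_ostree(marker="/run/ostree-booted"):
    # The fact mirrors the stat module's "exists" flag for the marker file:
    # ostree/bootc systems create /run/ostree-booted at boot, others do not.
    return {"__network_is_ostree": os.path.exists(marker)}

# On the managed node in the trace the marker is absent, so the fact is False.
facts = detect_ostree()
print(facts)
```

On a conventional (non-ostree) host this prints `{'__network_is_ostree': False}`, matching the `ansible_facts` result recorded for managed-node1.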
12755 1727204077.80931: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree
12755 1727204077.81095: in run() - task 12b410aa-8751-72e9-1a19-0000000001d0
12755 1727204077.81104: variable 'ansible_search_path' from source: unknown
12755 1727204077.81108: variable 'ansible_search_path' from source: unknown
12755 1727204077.81208: calling self._execute()
12755 1727204077.81257: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204077.81270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204077.81288: variable 'omit' from source: magic vars
12755 1727204077.81768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204077.82120: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204077.82178: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204077.82236: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204077.82296: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204077.82396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204077.82496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204077.82500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204077.82531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204077.82684: Evaluated conditional (not __network_is_ostree is defined): True
12755 1727204077.82699: variable 'omit' from source: magic vars
12755 1727204077.82763: variable 'omit' from source: magic vars
12755 1727204077.82930: variable '__ostree_booted_stat' from source: set_fact
12755 1727204077.83001: variable 'omit' from source: magic vars
12755 1727204077.83037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204077.83107: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204077.83112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204077.83141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204077.83159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204077.83215: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12755 1727204077.83221: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204077.83278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204077.83369: Set connection var ansible_connection to ssh
12755 1727204077.83389: Set connection var ansible_module_compression to ZIP_DEFLATED
12755 1727204077.83401: Set connection var ansible_shell_type to sh
12755 1727204077.83425: Set connection var ansible_timeout to 10
12755 1727204077.83443: Set connection var ansible_shell_executable to /bin/sh
12755 1727204077.83456: Set connection var ansible_pipelining to False
12755 1727204077.83496: variable 'ansible_shell_executable' from source: unknown
12755 1727204077.83500: variable 'ansible_connection' from source: unknown
12755 1727204077.83542: variable 'ansible_module_compression' from source: unknown
12755 1727204077.83545: variable 'ansible_shell_type' from source: unknown
12755 1727204077.83548: variable 'ansible_shell_executable' from source: unknown
12755 1727204077.83551: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204077.83553: variable 'ansible_pipelining' from source: unknown
12755 1727204077.83555: variable 'ansible_timeout' from source: unknown
12755 1727204077.83557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204077.83698: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12755 1727204077.83724: variable 'omit' from source: magic vars
12755 1727204077.83760: starting attempt loop
12755 1727204077.83763: running the handler
12755 1727204077.83766: handler run complete
12755 1727204077.83779: attempt loop complete, returning result
12755 1727204077.83787: _execute() done
12755 1727204077.83826: dumping result to json
12755 1727204077.83829: done dumping result, returning
12755 1727204077.83831: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [12b410aa-8751-72e9-1a19-0000000001d0]
12755 1727204077.83835: sending task result for task 12b410aa-8751-72e9-1a19-0000000001d0
ok: [managed-node1] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
12755 1727204077.84122: no more pending results, returning what we have
12755 1727204077.84125: results queue empty
12755 1727204077.84126: checking for any_errors_fatal
12755 1727204077.84134: done checking for any_errors_fatal
12755 1727204077.84135: checking for max_fail_percentage
12755 1727204077.84137: done checking for max_fail_percentage
12755 1727204077.84138: checking to see if all hosts have failed and the running result is not ok
12755 1727204077.84139: done checking to see if all hosts have failed
12755 1727204077.84140: getting the remaining hosts for this loop
12755 1727204077.84142: done getting the remaining hosts for this loop
12755 1727204077.84147: getting the next task for host managed-node1
12755 1727204077.84158: done getting next task for host managed-node1
12755 1727204077.84161: ^ task is: TASK: Fix CentOS6 Base repo
12755 1727204077.84165: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204077.84169: getting variables
12755 1727204077.84172: in VariableManager get_vars()
12755 1727204077.84310: Calling all_inventory to load vars for managed-node1
12755 1727204077.84314: Calling groups_inventory to load vars for managed-node1
12755 1727204077.84321: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204077.84393: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001d0
12755 1727204077.84397: WORKER PROCESS EXITING
12755 1727204077.84412: Calling all_plugins_play to load vars for managed-node1
12755 1727204077.84419: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204077.84430: Calling groups_plugins_play to load vars for managed-node1
12755 1727204077.84664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204077.84964: done with get_vars()
12755 1727204077.84976: done getting variables
12755 1727204077.85122: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.048)       0:00:03.087 *****
12755 1727204077.85155: entering _queue_task() for managed-node1/copy
12755 1727204077.85521: worker is 1 (out of 1 available)
12755 1727204077.85533: exiting _queue_task() for managed-node1/copy
12755 1727204077.85546: done queuing things up, now waiting for results queue to drain
12755 1727204077.85548: waiting for pending results...
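The `[WARNING]: Module invocation had junk after the JSON data` message earlier in this trace shows how Ansible copes with interpreter-shutdown noise (`# destroy ...`, `# cleanup[3] wiping ...`) appended after a module's JSON result: the JSON payload is extracted and the trailing text is reported as junk. A hedged Python sketch of that salvage step (a simplified brace-scan, not Ansible's actual implementation; the function name is hypothetical):

```python
import json

def split_result_from_junk(raw):
    # Locate the JSON payload between the first "{" and the last "}",
    # parse it, and return whatever trailing shutdown noise follows.
    # Assumes the junk itself contains no "}" characters.
    start = raw.index("{")
    end = raw.rindex("}") + 1
    return json.loads(raw[start:end]), raw[end:].strip()

result, junk = split_result_from_junk(
    '{"changed": false, "stat": {"exists": false}} # destroy sys # clear sys.modules'
)
print(result["stat"]["exists"])  # False
print(junk)                      # # destroy sys # clear sys.modules
```

This is why the task above still completes with `ok:` despite the noisy output: the result dictionary survives, and only the leftover text is surfaced as a warning.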
12755 1727204077.85709: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo
12755 1727204077.85839: in run() - task 12b410aa-8751-72e9-1a19-0000000001d2
12755 1727204077.85860: variable 'ansible_search_path' from source: unknown
12755 1727204077.85868: variable 'ansible_search_path' from source: unknown
12755 1727204077.85918: calling self._execute()
12755 1727204077.86014: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204077.86031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204077.86052: variable 'omit' from source: magic vars
12755 1727204077.86496: variable 'ansible_distribution' from source: facts
12755 1727204077.86534: Evaluated conditional (ansible_distribution == 'CentOS'): False
12755 1727204077.86537: when evaluation is False, skipping this task
12755 1727204077.86642: _execute() done
12755 1727204077.86645: dumping result to json
12755 1727204077.86648: done dumping result, returning
12755 1727204077.86651: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [12b410aa-8751-72e9-1a19-0000000001d2]
12755 1727204077.86654: sending task result for task 12b410aa-8751-72e9-1a19-0000000001d2
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution == 'CentOS'",
    "skip_reason": "Conditional result was False"
}
12755 1727204077.86824: no more pending results, returning what we have
12755 1727204077.86828: results queue empty
12755 1727204077.86830: checking for any_errors_fatal
12755 1727204077.86835: done checking for any_errors_fatal
12755 1727204077.86836: checking for max_fail_percentage
12755 1727204077.86838: done checking for max_fail_percentage
12755 1727204077.86839: checking to see if all hosts have failed and the running result is not ok
12755 1727204077.86840: done checking to see if all hosts have failed
12755 1727204077.86841: getting the remaining hosts for this loop
12755 1727204077.86842: done getting the remaining hosts for this loop
12755 1727204077.86847: getting the next task for host managed-node1
12755 1727204077.86856: done getting next task for host managed-node1
12755 1727204077.86859: ^ task is: TASK: Include the task 'enable_epel.yml'
12755 1727204077.86863: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204077.86868: getting variables
12755 1727204077.86870: in VariableManager get_vars()
12755 1727204077.87012: Calling all_inventory to load vars for managed-node1
12755 1727204077.87019: Calling groups_inventory to load vars for managed-node1
12755 1727204077.87023: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204077.87030: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001d2
12755 1727204077.87034: WORKER PROCESS EXITING
12755 1727204077.87044: Calling all_plugins_play to load vars for managed-node1
12755 1727204077.87048: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204077.87052: Calling groups_plugins_play to load vars for managed-node1
12755 1727204077.87386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204077.87679: done with get_vars()
12755 1727204077.87693: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.026) 0:00:03.113 *****
12755 1727204077.87801: entering _queue_task() for managed-node1/include_tasks
12755 1727204077.88063: worker is 1 (out of 1 available)
12755 1727204077.88076: exiting _queue_task() for managed-node1/include_tasks
12755 1727204077.88264: done queuing things up, now waiting for results queue to drain
12755 1727204077.88266: waiting for pending results...
12755 1727204077.88609: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml'
12755 1727204077.88921: in run() - task 12b410aa-8751-72e9-1a19-0000000001d3
12755 1727204077.88925: variable 'ansible_search_path' from source: unknown
12755 1727204077.88929: variable 'ansible_search_path' from source: unknown
12755 1727204077.88932: calling self._execute()
12755 1727204077.89144: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204077.89209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204077.89231: variable 'omit' from source: magic vars
12755 1727204077.90635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204077.98738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204077.98906: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204077.99500: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204077.99505: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204077.99508: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204077.99802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204077.99858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204077.99976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204078.00043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204078.00197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204078.00464: variable '__network_is_ostree' from source: set_fact
12755 1727204078.00519: Evaluated conditional (not __network_is_ostree | d(false)): True
12755 1727204078.00636: _execute() done
12755 1727204078.00639: dumping result to json
12755 1727204078.00642: done dumping result, returning
12755 1727204078.00644: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-72e9-1a19-0000000001d3]
12755 1727204078.00654: sending task result for task 12b410aa-8751-72e9-1a19-0000000001d3
12755 1727204078.00992: no more pending results, returning what we have
12755 1727204078.00999: in VariableManager get_vars()
12755 1727204078.01047: Calling all_inventory to load vars for managed-node1
12755 1727204078.01052: Calling groups_inventory to load vars for managed-node1
12755 1727204078.01057: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204078.01074: Calling all_plugins_play to load vars for managed-node1
12755 1727204078.01078: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204078.01083: Calling groups_plugins_play to load vars for managed-node1
12755 1727204078.01723: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001d3
12755 1727204078.01727: WORKER PROCESS EXITING
12755 1727204078.01770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204078.02496: done with get_vars()
12755 1727204078.02507: variable 'ansible_search_path' from source: unknown
12755 1727204078.02508: variable 'ansible_search_path' from source: unknown
12755 1727204078.02556: we have included files to process
12755 1727204078.02557: generating all_blocks data
12755 1727204078.02559: done generating all_blocks data
12755 1727204078.02565: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
12755 1727204078.02567: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
12755 1727204078.02570: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
12755 1727204078.04754: done processing included file
12755 1727204078.04757: iterating over new_blocks loaded from include file
12755 1727204078.04759: in VariableManager get_vars()
12755 1727204078.04896: done with get_vars()
12755 1727204078.04899: filtering new block on tags
12755 1727204078.04931: done filtering new block on tags
12755 1727204078.04935: in VariableManager get_vars()
12755 1727204078.04950: done with get_vars()
12755 1727204078.04951: filtering new block on tags
12755 1727204078.04966: done filtering new block on tags
12755 1727204078.04969: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1
12755 1727204078.04975: extending task lists for all hosts with included blocks
12755 1727204078.05361: done extending task lists
12755 1727204078.05363: done processing included files
12755 1727204078.05364: results queue empty
12755 1727204078.05365: checking for any_errors_fatal
12755 1727204078.05369: done checking for any_errors_fatal
12755 1727204078.05370: checking for max_fail_percentage
12755 1727204078.05372: done checking for max_fail_percentage
12755 1727204078.05373: checking to see if all hosts have failed and the running result is not ok
12755 1727204078.05374: done checking to see if all hosts have failed
12755 1727204078.05375: getting the remaining hosts for this loop
12755 1727204078.05376: done getting the remaining hosts for this loop
12755 1727204078.05379: getting the next task for host managed-node1
12755 1727204078.05384: done getting next task for host managed-node1
12755 1727204078.05386: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
12755 1727204078.05391: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204078.05394: getting variables
12755 1727204078.05396: in VariableManager get_vars()
12755 1727204078.05406: Calling all_inventory to load vars for managed-node1
12755 1727204078.05409: Calling groups_inventory to load vars for managed-node1
12755 1727204078.05412: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204078.05536: Calling all_plugins_play to load vars for managed-node1
12755 1727204078.05546: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204078.05552: Calling groups_plugins_play to load vars for managed-node1
12755 1727204078.06417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204078.07263: done with get_vars()
12755 1727204078.07276: done getting variables
12755 1727204078.07405: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
12755 1727204078.07665: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 39] **********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.200) 0:00:03.314 *****
12755 1727204078.07840: entering _queue_task() for managed-node1/command
12755 1727204078.07842: Creating lock for command
12755 1727204078.08568: worker is 1 (out of 1 available)
12755 1727204078.08581: exiting _queue_task() for managed-node1/command
12755 1727204078.08597: done queuing things up, now waiting for results queue to drain
12755 1727204078.08599: waiting for pending results...
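The TASK [Create EPEL 39] banner above is the templated name Create EPEL {{ ansible_distribution_major_version }} rendered from the ansible_distribution_major_version fact (39 on this host). Every skip recorded in this trace follows the same pattern: the task's when: expression is evaluated against gathered facts, and a False result ends the task before any module runs. A minimal sketch of such a conditional task, assuming only what the trace shows (the name and when: condition appear in the trace; the module body here is a hypothetical placeholder, not the real enable_epel.yml content):

```yaml
# Sketch only: name and when: are taken from the trace above;
# the command body is a hypothetical placeholder.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: /bin/true  # hypothetical body; real task content not shown in the trace
  when: ansible_distribution in ['RedHat', 'CentOS']
```

On this host ansible_distribution is neither RedHat nor CentOS, so the evaluation that follows reports the condition as False and the task result is a skip with "Conditional result was False".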
12755 1727204078.08911: running TaskExecutor() for managed-node1/TASK: Create EPEL 39
12755 1727204078.09362: in run() - task 12b410aa-8751-72e9-1a19-0000000001ed
12755 1727204078.09383: variable 'ansible_search_path' from source: unknown
12755 1727204078.09496: variable 'ansible_search_path' from source: unknown
12755 1727204078.09500: calling self._execute()
12755 1727204078.09551: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204078.09571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204078.09593: variable 'omit' from source: magic vars
12755 1727204078.10845: variable 'ansible_distribution' from source: facts
12755 1727204078.10849: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False
12755 1727204078.10852: when evaluation is False, skipping this task
12755 1727204078.10855: _execute() done
12755 1727204078.10857: dumping result to json
12755 1727204078.10864: done dumping result, returning
12755 1727204078.10867: done running TaskExecutor() for managed-node1/TASK: Create EPEL 39 [12b410aa-8751-72e9-1a19-0000000001ed]
12755 1727204078.10870: sending task result for task 12b410aa-8751-72e9-1a19-0000000001ed
12755 1727204078.11055: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001ed
12755 1727204078.11059: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in ['RedHat', 'CentOS']",
    "skip_reason": "Conditional result was False"
}
12755 1727204078.11143: no more pending results, returning what we have
12755 1727204078.11154: results queue empty
12755 1727204078.11156: checking for any_errors_fatal
12755 1727204078.11157: done checking for any_errors_fatal
12755 1727204078.11158: checking for max_fail_percentage
12755 1727204078.11160: done checking for max_fail_percentage
12755 1727204078.11161: checking to see if all hosts have failed and the running result is not ok
12755 1727204078.11162: done checking to see if all hosts have failed
12755 1727204078.11163: getting the remaining hosts for this loop
12755 1727204078.11165: done getting the remaining hosts for this loop
12755 1727204078.11283: getting the next task for host managed-node1
12755 1727204078.11298: done getting next task for host managed-node1
12755 1727204078.11302: ^ task is: TASK: Install yum-utils package
12755 1727204078.11306: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204078.11311: getting variables
12755 1727204078.11312: in VariableManager get_vars()
12755 1727204078.11347: Calling all_inventory to load vars for managed-node1
12755 1727204078.11350: Calling groups_inventory to load vars for managed-node1
12755 1727204078.11355: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204078.11371: Calling all_plugins_play to load vars for managed-node1
12755 1727204078.11375: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204078.11380: Calling groups_plugins_play to load vars for managed-node1
12755 1727204078.12133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204078.12864: done with get_vars()
12755 1727204078.12996: done getting variables
12755 1727204078.13221: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.054) 0:00:03.368 *****
12755 1727204078.13258: entering _queue_task() for managed-node1/package
12755 1727204078.13260: Creating lock for package
12755 1727204078.13601: worker is 1 (out of 1 available)
12755 1727204078.13615: exiting _queue_task() for managed-node1/package
12755 1727204078.13629: done queuing things up, now waiting for results queue to drain
12755 1727204078.13631: waiting for pending results...
12755 1727204078.13921: running TaskExecutor() for managed-node1/TASK: Install yum-utils package
12755 1727204078.14198: in run() - task 12b410aa-8751-72e9-1a19-0000000001ee
12755 1727204078.14202: variable 'ansible_search_path' from source: unknown
12755 1727204078.14204: variable 'ansible_search_path' from source: unknown
12755 1727204078.14207: calling self._execute()
12755 1727204078.14263: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204078.14276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204078.14298: variable 'omit' from source: magic vars
12755 1727204078.14778: variable 'ansible_distribution' from source: facts
12755 1727204078.14846: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False
12755 1727204078.14854: when evaluation is False, skipping this task
12755 1727204078.14857: _execute() done
12755 1727204078.14860: dumping result to json
12755 1727204078.14862: done dumping result, returning
12755 1727204078.14865: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [12b410aa-8751-72e9-1a19-0000000001ee]
12755 1727204078.14867: sending task result for task 12b410aa-8751-72e9-1a19-0000000001ee
12755 1727204078.15094: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001ee
12755 1727204078.15098: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in ['RedHat', 'CentOS']",
    "skip_reason": "Conditional result was False"
}
12755 1727204078.15153: no more pending results, returning what we have
12755 1727204078.15157: results queue empty
12755 1727204078.15159: checking for any_errors_fatal
12755 1727204078.15167: done checking for any_errors_fatal
12755 1727204078.15168: checking for max_fail_percentage
12755 1727204078.15170: done checking for max_fail_percentage
12755 1727204078.15171: checking to see if all hosts have failed and the running result is not ok
12755 1727204078.15172: done checking to see if all hosts have failed
12755 1727204078.15173: getting the remaining hosts for this loop
12755 1727204078.15177: done getting the remaining hosts for this loop
12755 1727204078.15182: getting the next task for host managed-node1
12755 1727204078.15196: done getting next task for host managed-node1
12755 1727204078.15199: ^ task is: TASK: Enable EPEL 7
12755 1727204078.15204: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204078.15212: getting variables
12755 1727204078.15215: in VariableManager get_vars()
12755 1727204078.15250: Calling all_inventory to load vars for managed-node1
12755 1727204078.15254: Calling groups_inventory to load vars for managed-node1
12755 1727204078.15258: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204078.15275: Calling all_plugins_play to load vars for managed-node1
12755 1727204078.15279: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204078.15283: Calling groups_plugins_play to load vars for managed-node1
12755 1727204078.15673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204078.15982: done with get_vars()
12755 1727204078.15996: done getting variables
12755 1727204078.16083: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.028) 0:00:03.396 *****
12755 1727204078.16119: entering _queue_task() for managed-node1/command
12755 1727204078.16523: worker is 1 (out of 1 available)
12755 1727204078.16536: exiting _queue_task() for managed-node1/command
12755 1727204078.16550: done queuing things up, now waiting for results queue to drain
12755 1727204078.16551: waiting for pending results...
12755 1727204078.16832: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7
12755 1727204078.16882: in run() - task 12b410aa-8751-72e9-1a19-0000000001ef
12755 1727204078.16931: variable 'ansible_search_path' from source: unknown
12755 1727204078.16935: variable 'ansible_search_path' from source: unknown
12755 1727204078.16966: calling self._execute()
12755 1727204078.17094: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204078.17098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204078.17101: variable 'omit' from source: magic vars
12755 1727204078.17907: variable 'ansible_distribution' from source: facts
12755 1727204078.17912: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False
12755 1727204078.17915: when evaluation is False, skipping this task
12755 1727204078.17918: _execute() done
12755 1727204078.17920: dumping result to json
12755 1727204078.17922: done dumping result, returning
12755 1727204078.17924: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [12b410aa-8751-72e9-1a19-0000000001ef]
12755 1727204078.17927: sending task result for task 12b410aa-8751-72e9-1a19-0000000001ef
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in ['RedHat', 'CentOS']",
    "skip_reason": "Conditional result was False"
}
12755 1727204078.18620: no more pending results, returning what we have
12755 1727204078.18623: results queue empty
12755 1727204078.18624: checking for any_errors_fatal
12755 1727204078.18629: done checking for any_errors_fatal
12755 1727204078.18630: checking for max_fail_percentage
12755 1727204078.18633: done checking for max_fail_percentage
12755 1727204078.18634: checking to see if all hosts have failed and the running result is not ok
12755 1727204078.18635: done checking to see if all hosts have failed
12755 1727204078.18636: getting the remaining hosts for this loop
12755 1727204078.18637: done getting the remaining hosts for this loop
12755 1727204078.18641: getting the next task for host managed-node1
12755 1727204078.18648: done getting next task for host managed-node1
12755 1727204078.18651: ^ task is: TASK: Enable EPEL 8
12755 1727204078.18655: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204078.18660: getting variables
12755 1727204078.18662: in VariableManager get_vars()
12755 1727204078.18691: Calling all_inventory to load vars for managed-node1
12755 1727204078.18694: Calling groups_inventory to load vars for managed-node1
12755 1727204078.18698: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204078.18705: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001ef
12755 1727204078.18716: Calling all_plugins_play to load vars for managed-node1
12755 1727204078.18720: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204078.18725: Calling groups_plugins_play to load vars for managed-node1
12755 1727204078.19187: WORKER PROCESS EXITING
12755 1727204078.19211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204078.19506: done with get_vars()
12755 1727204078.19517: done getting variables
12755 1727204078.19592: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.035) 0:00:03.431 *****
12755 1727204078.19627: entering _queue_task() for managed-node1/command
12755 1727204078.20299: worker is 1 (out of 1 available)
12755 1727204078.20309: exiting _queue_task() for managed-node1/command
12755 1727204078.20320: done queuing things up, now waiting for results queue to drain
12755 1727204078.20322: waiting for pending results...
12755 1727204078.20502: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8
12755 1727204078.20868: in run() - task 12b410aa-8751-72e9-1a19-0000000001f0
12755 1727204078.20893: variable 'ansible_search_path' from source: unknown
12755 1727204078.20987: variable 'ansible_search_path' from source: unknown
12755 1727204078.21087: calling self._execute()
12755 1727204078.21597: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204078.21603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204078.21606: variable 'omit' from source: magic vars
12755 1727204078.22520: variable 'ansible_distribution' from source: facts
12755 1727204078.22689: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False
12755 1727204078.22695: when evaluation is False, skipping this task
12755 1727204078.22698: _execute() done
12755 1727204078.22701: dumping result to json
12755 1727204078.22703: done dumping result, returning
12755 1727204078.22706: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [12b410aa-8751-72e9-1a19-0000000001f0]
12755 1727204078.22708: sending task result for task 12b410aa-8751-72e9-1a19-0000000001f0
12755 1727204078.22782: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001f0
12755 1727204078.22786: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in ['RedHat', 'CentOS']",
    "skip_reason": "Conditional result was False"
}
12755 1727204078.22849: no more pending results, returning what we have
12755 1727204078.22853: results queue empty
12755 1727204078.22854: checking for any_errors_fatal
12755 1727204078.22860: done checking for any_errors_fatal
12755 1727204078.22861: checking for max_fail_percentage
12755 1727204078.22863: done checking for max_fail_percentage
12755 1727204078.22864: checking to see if all hosts have failed and the running result is not ok
12755 1727204078.22866: done checking to see if all hosts have failed
12755 1727204078.22867: getting the remaining hosts for this loop
12755 1727204078.22868: done getting the remaining hosts for this loop
12755 1727204078.22874: getting the next task for host managed-node1
12755 1727204078.22887: done getting next task for host managed-node1
12755 1727204078.22893: ^ task is: TASK: Enable EPEL 6
12755 1727204078.22898: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204078.22908: getting variables
12755 1727204078.22911: in VariableManager get_vars()
12755 1727204078.22948: Calling all_inventory to load vars for managed-node1
12755 1727204078.22952: Calling groups_inventory to load vars for managed-node1
12755 1727204078.22957: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204078.22974: Calling all_plugins_play to load vars for managed-node1
12755 1727204078.22978: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204078.22982: Calling groups_plugins_play to load vars for managed-node1
12755 1727204078.24027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204078.24488: done with get_vars()
12755 1727204078.24504: done getting variables
12755 1727204078.24572: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.052) 0:00:03.484 *****
12755 1727204078.24896: entering _queue_task() for managed-node1/copy
12755 1727204078.25559: worker is 1 (out of 1 available)
12755 1727204078.25574: exiting _queue_task() for managed-node1/copy
12755 1727204078.25588: done queuing things up, now waiting for results queue to drain
12755 1727204078.25687: waiting for pending results...
12755 1727204078.26307: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6
12755 1727204078.26496: in run() - task 12b410aa-8751-72e9-1a19-0000000001f2
12755 1727204078.26501: variable 'ansible_search_path' from source: unknown
12755 1727204078.26505: variable 'ansible_search_path' from source: unknown
12755 1727204078.26508: calling self._execute()
12755 1727204078.26511: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204078.26514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204078.26895: variable 'omit' from source: magic vars
12755 1727204078.27575: variable 'ansible_distribution' from source: facts
12755 1727204078.27598: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False
12755 1727204078.27895: when evaluation is False, skipping this task
12755 1727204078.27899: _execute() done
12755 1727204078.27901: dumping result to json
12755 1727204078.27904: done dumping result, returning
12755 1727204078.27906: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [12b410aa-8751-72e9-1a19-0000000001f2]
12755 1727204078.27908: sending task result for task 12b410aa-8751-72e9-1a19-0000000001f2
12755 1727204078.27992: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001f2
12755 1727204078.27997: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in ['RedHat', 'CentOS']",
    "skip_reason": "Conditional result was False"
}
12755 1727204078.28058: no more pending results, returning what we have
12755 1727204078.28062: results queue empty
12755 1727204078.28063: checking for any_errors_fatal
12755 1727204078.28068: done checking for any_errors_fatal
12755 1727204078.28069: checking for max_fail_percentage
12755 1727204078.28071: done checking for max_fail_percentage
12755 1727204078.28072: checking to see if all hosts have failed and the running result is not ok
12755 1727204078.28073: done checking to see if all hosts have failed
12755 1727204078.28074: getting the remaining hosts for this loop
12755 1727204078.28076: done getting the remaining hosts for this loop
12755 1727204078.28080: getting the next task for host managed-node1
12755 1727204078.28093: done getting next task for host managed-node1
12755 1727204078.28097: ^ task is: TASK: Set network provider to 'nm'
12755 1727204078.28100: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204078.28104: getting variables
12755 1727204078.28106: in VariableManager get_vars()
12755 1727204078.28143: Calling all_inventory to load vars for managed-node1
12755 1727204078.28147: Calling groups_inventory to load vars for managed-node1
12755 1727204078.28152: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204078.28167: Calling all_plugins_play to load vars for managed-node1
12755 1727204078.28170: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204078.28174: Calling groups_plugins_play to load vars for managed-node1
12755 1727204078.28815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204078.29442: done with get_vars()
12755 1727204078.29454: done getting variables
12755 1727204078.29681: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:13
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.048) 0:00:03.533 *****
12755 1727204078.29765: entering _queue_task() for managed-node1/set_fact
12755 1727204078.30427: worker is 1 (out of 1 available)
12755 1727204078.30440: exiting _queue_task() for managed-node1/set_fact
12755 1727204078.30453: done queuing things up, now waiting for results queue to drain
12755 1727204078.30455: waiting for pending results...
12755 1727204078.31107: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm'
12755 1727204078.31111: in run() - task 12b410aa-8751-72e9-1a19-000000000007
12755 1727204078.31114: variable 'ansible_search_path' from source: unknown
12755 1727204078.31496: calling self._execute()
12755 1727204078.31500: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204078.31503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204078.31506: variable 'omit' from source: magic vars
12755 1727204078.31751: variable 'omit' from source: magic vars
12755 1727204078.31803: variable 'omit' from source: magic vars
12755 1727204078.31856: variable 'omit' from source: magic vars
12755 1727204078.32149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204078.32201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204078.32235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204078.32259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204078.32279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204078.32323: variable
'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204078.32695: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204078.32698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204078.32701: Set connection var ansible_connection to ssh 12755 1727204078.32704: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204078.32706: Set connection var ansible_shell_type to sh 12755 1727204078.32708: Set connection var ansible_timeout to 10 12755 1727204078.32710: Set connection var ansible_shell_executable to /bin/sh 12755 1727204078.32713: Set connection var ansible_pipelining to False 12755 1727204078.32922: variable 'ansible_shell_executable' from source: unknown 12755 1727204078.32935: variable 'ansible_connection' from source: unknown 12755 1727204078.32944: variable 'ansible_module_compression' from source: unknown 12755 1727204078.32952: variable 'ansible_shell_type' from source: unknown 12755 1727204078.32960: variable 'ansible_shell_executable' from source: unknown 12755 1727204078.32968: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204078.32977: variable 'ansible_pipelining' from source: unknown 12755 1727204078.32985: variable 'ansible_timeout' from source: unknown 12755 1727204078.32998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204078.33393: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204078.33419: variable 'omit' from source: magic vars 12755 1727204078.33432: starting attempt loop 12755 1727204078.33441: running the handler 12755 1727204078.33459: handler run complete 12755 1727204078.33476: attempt loop 
complete, returning result 12755 1727204078.33485: _execute() done 12755 1727204078.33496: dumping result to json 12755 1727204078.33506: done dumping result, returning 12755 1727204078.33523: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [12b410aa-8751-72e9-1a19-000000000007] 12755 1727204078.33795: sending task result for task 12b410aa-8751-72e9-1a19-000000000007 12755 1727204078.33874: done sending task result for task 12b410aa-8751-72e9-1a19-000000000007 12755 1727204078.33878: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 12755 1727204078.33945: no more pending results, returning what we have 12755 1727204078.33949: results queue empty 12755 1727204078.33950: checking for any_errors_fatal 12755 1727204078.33956: done checking for any_errors_fatal 12755 1727204078.33957: checking for max_fail_percentage 12755 1727204078.33959: done checking for max_fail_percentage 12755 1727204078.33960: checking to see if all hosts have failed and the running result is not ok 12755 1727204078.33961: done checking to see if all hosts have failed 12755 1727204078.33961: getting the remaining hosts for this loop 12755 1727204078.33963: done getting the remaining hosts for this loop 12755 1727204078.33970: getting the next task for host managed-node1 12755 1727204078.33979: done getting next task for host managed-node1 12755 1727204078.33981: ^ task is: TASK: meta (flush_handlers) 12755 1727204078.33983: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204078.33990: getting variables 12755 1727204078.33993: in VariableManager get_vars() 12755 1727204078.34028: Calling all_inventory to load vars for managed-node1 12755 1727204078.34032: Calling groups_inventory to load vars for managed-node1 12755 1727204078.34037: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204078.34050: Calling all_plugins_play to load vars for managed-node1 12755 1727204078.34054: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204078.34058: Calling groups_plugins_play to load vars for managed-node1 12755 1727204078.34605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204078.35228: done with get_vars() 12755 1727204078.35242: done getting variables 12755 1727204078.35441: in VariableManager get_vars() 12755 1727204078.35455: Calling all_inventory to load vars for managed-node1 12755 1727204078.35458: Calling groups_inventory to load vars for managed-node1 12755 1727204078.35461: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204078.35466: Calling all_plugins_play to load vars for managed-node1 12755 1727204078.35470: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204078.35473: Calling groups_plugins_play to load vars for managed-node1 12755 1727204078.36001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204078.36672: done with get_vars() 12755 1727204078.36695: done queuing things up, now waiting for results queue to drain 12755 1727204078.36698: results queue empty 12755 1727204078.36699: checking for any_errors_fatal 12755 1727204078.36703: done checking for any_errors_fatal 12755 1727204078.36704: checking for max_fail_percentage 12755 1727204078.36706: done checking for max_fail_percentage 12755 1727204078.36707: checking to see if all hosts have failed and the running result is not 
ok 12755 1727204078.36708: done checking to see if all hosts have failed 12755 1727204078.36709: getting the remaining hosts for this loop 12755 1727204078.36710: done getting the remaining hosts for this loop 12755 1727204078.36714: getting the next task for host managed-node1 12755 1727204078.36781: done getting next task for host managed-node1 12755 1727204078.36784: ^ task is: TASK: meta (flush_handlers) 12755 1727204078.36786: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204078.36798: getting variables 12755 1727204078.36800: in VariableManager get_vars() 12755 1727204078.36812: Calling all_inventory to load vars for managed-node1 12755 1727204078.36815: Calling groups_inventory to load vars for managed-node1 12755 1727204078.36817: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204078.36824: Calling all_plugins_play to load vars for managed-node1 12755 1727204078.36889: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204078.36897: Calling groups_plugins_play to load vars for managed-node1 12755 1727204078.37259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204078.37986: done with get_vars() 12755 1727204078.37998: done getting variables 12755 1727204078.38120: in VariableManager get_vars() 12755 1727204078.38131: Calling all_inventory to load vars for managed-node1 12755 1727204078.38134: Calling groups_inventory to load vars for managed-node1 12755 1727204078.38193: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204078.38201: Calling all_plugins_play to load vars for managed-node1 12755 1727204078.38205: Calling groups_plugins_inventory to load vars for 
managed-node1 12755 1727204078.38209: Calling groups_plugins_play to load vars for managed-node1 12755 1727204078.38592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204078.39294: done with get_vars() 12755 1727204078.39315: done queuing things up, now waiting for results queue to drain 12755 1727204078.39318: results queue empty 12755 1727204078.39319: checking for any_errors_fatal 12755 1727204078.39321: done checking for any_errors_fatal 12755 1727204078.39321: checking for max_fail_percentage 12755 1727204078.39323: done checking for max_fail_percentage 12755 1727204078.39324: checking to see if all hosts have failed and the running result is not ok 12755 1727204078.39325: done checking to see if all hosts have failed 12755 1727204078.39326: getting the remaining hosts for this loop 12755 1727204078.39327: done getting the remaining hosts for this loop 12755 1727204078.39330: getting the next task for host managed-node1 12755 1727204078.39334: done getting next task for host managed-node1 12755 1727204078.39335: ^ task is: None 12755 1727204078.39337: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204078.39339: done queuing things up, now waiting for results queue to drain 12755 1727204078.39340: results queue empty 12755 1727204078.39340: checking for any_errors_fatal 12755 1727204078.39341: done checking for any_errors_fatal 12755 1727204078.39342: checking for max_fail_percentage 12755 1727204078.39343: done checking for max_fail_percentage 12755 1727204078.39422: checking to see if all hosts have failed and the running result is not ok 12755 1727204078.39424: done checking to see if all hosts have failed 12755 1727204078.39427: getting the next task for host managed-node1 12755 1727204078.39432: done getting next task for host managed-node1 12755 1727204078.39433: ^ task is: None 12755 1727204078.39435: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204078.39503: in VariableManager get_vars() 12755 1727204078.39641: done with get_vars() 12755 1727204078.39650: in VariableManager get_vars() 12755 1727204078.39759: done with get_vars() 12755 1727204078.39766: variable 'omit' from source: magic vars 12755 1727204078.39817: in VariableManager get_vars() 12755 1727204078.39847: done with get_vars() 12755 1727204078.39973: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 12755 1727204078.44057: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12755 1727204078.44222: getting the remaining hosts for this loop 12755 1727204078.44224: done getting the remaining hosts for this loop 12755 1727204078.44227: getting the next task for host managed-node1 12755 1727204078.44231: done getting next task for host managed-node1 12755 1727204078.44233: ^ task is: TASK: Gathering Facts 12755 1727204078.44235: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204078.44237: getting variables 12755 1727204078.44238: in VariableManager get_vars() 12755 1727204078.44261: Calling all_inventory to load vars for managed-node1 12755 1727204078.44264: Calling groups_inventory to load vars for managed-node1 12755 1727204078.44266: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204078.44273: Calling all_plugins_play to load vars for managed-node1 12755 1727204078.44291: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204078.44296: Calling groups_plugins_play to load vars for managed-node1 12755 1727204078.44665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204078.44957: done with get_vars() 12755 1727204078.44968: done getting variables 12755 1727204078.45034: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.152) 0:00:03.686 ***** 12755 1727204078.45065: entering _queue_task() for managed-node1/gather_facts 12755 1727204078.45546: worker is 1 (out of 1 available) 12755 1727204078.45560: exiting _queue_task() for managed-node1/gather_facts 12755 1727204078.45572: done queuing things up, now waiting for results queue to drain 12755 1727204078.45574: waiting for pending results... 
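The skipped "Enable EPEL 6" task and the "Set network provider to 'nm'" result traced above correspond to playbook tasks of roughly the following shape. This is a hedged reconstruction from the log alone — the `when` expression and the resulting fact are taken verbatim from the output above, but the module and arguments of the EPEL task are not visible in this log and are deliberately left out:

```yaml
# Sketch reconstructed from the log output; not the verbatim test playbook.
- name: Enable EPEL 6
  # (module/arguments not visible in this log)
  when: ansible_distribution in ['RedHat', 'CentOS']  # evaluated False on Fedora, so the task is skipped

- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm  # produces the "ok" result with ansible_facts.network_provider == "nm"
```

The "Gathering Facts" task that starts next is gated the same way: the log shows its conditional `ansible_distribution_major_version != '6'` evaluating True before the setup module is transferred and run over SSH.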
12755 1727204078.45735: running TaskExecutor() for managed-node1/TASK: Gathering Facts 12755 1727204078.45908: in run() - task 12b410aa-8751-72e9-1a19-000000000218 12755 1727204078.45911: variable 'ansible_search_path' from source: unknown 12755 1727204078.45935: calling self._execute() 12755 1727204078.46059: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204078.46073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204078.46125: variable 'omit' from source: magic vars 12755 1727204078.46583: variable 'ansible_distribution_major_version' from source: facts 12755 1727204078.46606: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204078.46619: variable 'omit' from source: magic vars 12755 1727204078.46668: variable 'omit' from source: magic vars 12755 1727204078.46748: variable 'omit' from source: magic vars 12755 1727204078.46791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204078.46856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204078.46878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204078.46913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204078.46965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204078.46980: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204078.46991: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204078.47002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204078.47141: Set connection var ansible_connection to ssh 12755 1727204078.47183: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204078.47187: Set connection var ansible_shell_type to sh 12755 1727204078.47191: Set connection var ansible_timeout to 10 12755 1727204078.47203: Set connection var ansible_shell_executable to /bin/sh 12755 1727204078.47215: Set connection var ansible_pipelining to False 12755 1727204078.47250: variable 'ansible_shell_executable' from source: unknown 12755 1727204078.47292: variable 'ansible_connection' from source: unknown 12755 1727204078.47297: variable 'ansible_module_compression' from source: unknown 12755 1727204078.47299: variable 'ansible_shell_type' from source: unknown 12755 1727204078.47301: variable 'ansible_shell_executable' from source: unknown 12755 1727204078.47303: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204078.47305: variable 'ansible_pipelining' from source: unknown 12755 1727204078.47307: variable 'ansible_timeout' from source: unknown 12755 1727204078.47312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204078.47537: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204078.47596: variable 'omit' from source: magic vars 12755 1727204078.47601: starting attempt loop 12755 1727204078.47603: running the handler 12755 1727204078.47606: variable 'ansible_facts' from source: unknown 12755 1727204078.47635: _low_level_execute_command(): starting 12755 1727204078.47648: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204078.48514: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204078.48636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204078.48726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204078.50534: stdout chunk (state=3): >>>/root <<< 12755 1727204078.50844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204078.51146: stderr chunk (state=3): >>><<< 12755 1727204078.51150: stdout chunk (state=3): >>><<< 12755 1727204078.51155: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204078.51157: _low_level_execute_command(): starting 12755 1727204078.51160: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028 `" && echo ansible-tmp-1727204078.5104575-12947-45482502358028="` echo /root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028 `" ) && sleep 0' 12755 1727204078.52348: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204078.52577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204078.52607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204078.52694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204078.54795: stdout chunk (state=3): >>>ansible-tmp-1727204078.5104575-12947-45482502358028=/root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028 <<< 12755 1727204078.55113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204078.55128: stdout chunk (state=3): >>><<< 12755 1727204078.55141: stderr chunk (state=3): >>><<< 12755 1727204078.55175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204078.5104575-12947-45482502358028=/root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204078.55223: variable 'ansible_module_compression' from source: unknown 12755 1727204078.55496: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12755 1727204078.55530: variable 'ansible_facts' from source: unknown 12755 1727204078.55883: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/AnsiballZ_setup.py 12755 1727204078.56401: Sending initial data 12755 1727204078.56412: Sent initial data (153 bytes) 12755 1727204078.57758: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204078.57883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204078.57902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204078.57978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204078.59926: stderr chunk 
(state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204078.59959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12755 1727204078.60065: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpizirz4ze /root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/AnsiballZ_setup.py <<< 12755 1727204078.60068: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/AnsiballZ_setup.py" <<< 12755 1727204078.60110: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpizirz4ze" to remote "/root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/AnsiballZ_setup.py" <<< 12755 1727204078.66833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204078.67052: stdout chunk (state=3): >>><<< 12755 1727204078.67055: stderr chunk (state=3): >>><<< 12755 1727204078.67057: done transferring module to remote 12755 1727204078.67060: 
_low_level_execute_command(): starting 12755 1727204078.67061: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/ /root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/AnsiballZ_setup.py && sleep 0' 12755 1727204078.68211: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204078.68237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204078.68306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204078.68411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204078.68430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204078.68506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204078.70555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204078.70610: stderr chunk (state=3): >>><<< 12755 1727204078.70624: stdout chunk (state=3): >>><<< 12755 1727204078.70665: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204078.70703: _low_level_execute_command(): starting 12755 1727204078.70718: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/AnsiballZ_setup.py && sleep 0' 12755 1727204078.71961: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204078.72105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204078.72352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204078.72549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204079.40340: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", 
"ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_loadavg": {"1m": 0.52685546875, "5m": 0.36962890625, "15m": 0.1767578125}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "39", "epoch": "1727204079", "epoch_int": "1727204079", "date": "2024-09-24", "time": "14:54:39", "iso8601_micro": "2024-09-24T18:54:39.044225Z", "iso8601": "2024-09-24T18:54:39Z", "iso8601_basic": "20240924T145439044225", "iso8601_basic_short": "20240924T145439", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansibl<<< 12755 1727204079.40379: stdout chunk (state=3): >>>e_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_hostnqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2844, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 873, "free": 2844}, "nocache": {"free": 3470, "used": 247}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_uuid": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": 
{"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 569, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251156525056, "block_size": 4096, "block_total": 64479564, "block_available": 61317511, "block_used": 3162053, "inode_total": 16384000, "inode_available": 16302250, "inode_used": 81750, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d080:f60d:659:9515", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.210"], "ansible_all_ipv6_addresses": ["fe80::d080:f60d:659:9515"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.210", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d080:f60d:659:9515"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12755 1727204079.42645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204079.42649: stdout chunk (state=3): >>><<< 12755 1727204079.42652: stderr chunk (state=3): >>><<< 12755 1727204079.42700: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_loadavg": {"1m": 0.52685546875, "5m": 0.36962890625, "15m": 0.1767578125}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "39", "epoch": "1727204079", "epoch_int": "1727204079", "date": "2024-09-24", "time": 
"14:54:39", "iso8601_micro": "2024-09-24T18:54:39.044225Z", "iso8601": "2024-09-24T18:54:39Z", "iso8601_basic": "20240924T145439044225", "iso8601_basic_short": "20240924T145439", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_hostnqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", 
"serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2844, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 873, "free": 2844}, "nocache": {"free": 3470, "used": 247}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_uuid": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], 
"masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 569, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251156525056, "block_size": 4096, "block_total": 64479564, "block_available": 61317511, "block_used": 3162053, "inode_total": 16384000, "inode_available": 16302250, "inode_used": 81750, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d080:f60d:659:9515", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": 
{"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.210"], "ansible_all_ipv6_addresses": ["fe80::d080:f60d:659:9515"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.210", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d080:f60d:659:9515"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204079.43407: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204079.43495: _low_level_execute_command(): starting 12755 1727204079.43499: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204078.5104575-12947-45482502358028/ > /dev/null 2>&1 && sleep 0' 12755 1727204079.44134: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204079.44149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204079.44185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204079.44301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204079.44322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204079.44342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204079.44429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204079.46407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204079.46490: stderr chunk (state=3): >>><<< 12755 1727204079.46505: stdout chunk (state=3): >>><<< 12755 1727204079.46541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204079.46557: handler run complete 12755 1727204079.46901: variable 
'ansible_facts' from source: unknown 12755 1727204079.46905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204079.47406: variable 'ansible_facts' from source: unknown 12755 1727204079.47536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204079.47827: attempt loop complete, returning result 12755 1727204079.47885: _execute() done 12755 1727204079.47992: dumping result to json 12755 1727204079.48037: done dumping result, returning 12755 1727204079.48052: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [12b410aa-8751-72e9-1a19-000000000218] 12755 1727204079.48061: sending task result for task 12b410aa-8751-72e9-1a19-000000000218 ok: [managed-node1] 12755 1727204079.49149: no more pending results, returning what we have 12755 1727204079.49153: results queue empty 12755 1727204079.49154: checking for any_errors_fatal 12755 1727204079.49156: done checking for any_errors_fatal 12755 1727204079.49157: checking for max_fail_percentage 12755 1727204079.49159: done checking for max_fail_percentage 12755 1727204079.49160: checking to see if all hosts have failed and the running result is not ok 12755 1727204079.49161: done checking to see if all hosts have failed 12755 1727204079.49162: getting the remaining hosts for this loop 12755 1727204079.49163: done getting the remaining hosts for this loop 12755 1727204079.49167: getting the next task for host managed-node1 12755 1727204079.49176: done sending task result for task 12b410aa-8751-72e9-1a19-000000000218 12755 1727204079.49180: WORKER PROCESS EXITING 12755 1727204079.49186: done getting next task for host managed-node1 12755 1727204079.49188: ^ task is: TASK: meta (flush_handlers) 12755 1727204079.49192: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204079.49197: getting variables 12755 1727204079.49199: in VariableManager get_vars() 12755 1727204079.49256: Calling all_inventory to load vars for managed-node1 12755 1727204079.49260: Calling groups_inventory to load vars for managed-node1 12755 1727204079.49263: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204079.49275: Calling all_plugins_play to load vars for managed-node1 12755 1727204079.49279: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204079.49283: Calling groups_plugins_play to load vars for managed-node1 12755 1727204079.49515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204079.49839: done with get_vars() 12755 1727204079.49851: done getting variables 12755 1727204079.49944: in VariableManager get_vars() 12755 1727204079.49971: Calling all_inventory to load vars for managed-node1 12755 1727204079.49974: Calling groups_inventory to load vars for managed-node1 12755 1727204079.49977: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204079.49988: Calling all_plugins_play to load vars for managed-node1 12755 1727204079.49991: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204079.49997: Calling groups_plugins_play to load vars for managed-node1 12755 1727204079.50194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204079.50493: done with get_vars() 12755 1727204079.50510: done queuing things up, now waiting for results queue to drain 12755 1727204079.50513: results queue empty 12755 1727204079.50514: checking for any_errors_fatal 12755 1727204079.50521: done checking for any_errors_fatal 12755 1727204079.50522: checking for max_fail_percentage 12755 
1727204079.50523: done checking for max_fail_percentage 12755 1727204079.50524: checking to see if all hosts have failed and the running result is not ok 12755 1727204079.50533: done checking to see if all hosts have failed 12755 1727204079.50535: getting the remaining hosts for this loop 12755 1727204079.50536: done getting the remaining hosts for this loop 12755 1727204079.50539: getting the next task for host managed-node1 12755 1727204079.50544: done getting next task for host managed-node1 12755 1727204079.50546: ^ task is: TASK: INIT Prepare setup 12755 1727204079.50548: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204079.50550: getting variables 12755 1727204079.50551: in VariableManager get_vars() 12755 1727204079.50575: Calling all_inventory to load vars for managed-node1 12755 1727204079.50578: Calling groups_inventory to load vars for managed-node1 12755 1727204079.50580: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204079.50586: Calling all_plugins_play to load vars for managed-node1 12755 1727204079.50591: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204079.50596: Calling groups_plugins_play to load vars for managed-node1 12755 1727204079.50793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204079.51082: done with get_vars() 12755 1727204079.51094: done getting variables 12755 1727204079.51195: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=False, class_only=True)

TASK [INIT Prepare setup] ******************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15
Tuesday 24 September 2024 14:54:39 -0400 (0:00:01.061) 0:00:04.747 *****
12755 1727204079.51228: entering _queue_task() for managed-node1/debug 12755 1727204079.51230: Creating lock for debug 12755 1727204079.51642: worker is 1 (out of 1 available) 12755 1727204079.51655: exiting _queue_task() for managed-node1/debug 12755 1727204079.51667: done queuing things up, now waiting for results queue to drain 12755 1727204079.51669: waiting for pending results... 12755 1727204079.51882: running TaskExecutor() for managed-node1/TASK: INIT Prepare setup 12755 1727204079.52001: in run() - task 12b410aa-8751-72e9-1a19-00000000000b 12755 1727204079.52058: variable 'ansible_search_path' from source: unknown 12755 1727204079.52076: calling self._execute() 12755 1727204079.52185: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204079.52203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204079.52224: variable 'omit' from source: magic vars 12755 1727204079.52817: variable 'ansible_distribution_major_version' from source: facts 12755 1727204079.52826: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204079.52829: variable 'omit' from source: magic vars 12755 1727204079.52852: variable 'omit' from source: magic vars 12755 1727204079.52907: variable 'omit' from source: magic vars 12755 1727204079.52995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204079.53021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204079.53059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755
1727204079.53086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204079.53140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204079.53159: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204079.53169: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204079.53179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204079.53324: Set connection var ansible_connection to ssh 12755 1727204079.53339: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204079.53357: Set connection var ansible_shell_type to sh 12755 1727204079.53395: Set connection var ansible_timeout to 10 12755 1727204079.53399: Set connection var ansible_shell_executable to /bin/sh 12755 1727204079.53403: Set connection var ansible_pipelining to False 12755 1727204079.53436: variable 'ansible_shell_executable' from source: unknown 12755 1727204079.53465: variable 'ansible_connection' from source: unknown 12755 1727204079.53468: variable 'ansible_module_compression' from source: unknown 12755 1727204079.53475: variable 'ansible_shell_type' from source: unknown 12755 1727204079.53478: variable 'ansible_shell_executable' from source: unknown 12755 1727204079.53481: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204079.53587: variable 'ansible_pipelining' from source: unknown 12755 1727204079.53593: variable 'ansible_timeout' from source: unknown 12755 1727204079.53596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204079.53693: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204079.53725: variable 'omit' from source: magic vars 12755 1727204079.53736: starting attempt loop 12755 1727204079.53744: running the handler 12755 1727204079.53811: handler run complete 12755 1727204079.53895: attempt loop complete, returning result 12755 1727204079.53898: _execute() done 12755 1727204079.53900: dumping result to json 12755 1727204079.53902: done dumping result, returning 12755 1727204079.53905: done running TaskExecutor() for managed-node1/TASK: INIT Prepare setup [12b410aa-8751-72e9-1a19-00000000000b] 12755 1727204079.53907: sending task result for task 12b410aa-8751-72e9-1a19-00000000000b
ok: [managed-node1] => {}

MSG:

##################################################
12755 1727204079.54096: no more pending results, returning what we have 12755 1727204079.54099: results queue empty 12755 1727204079.54101: checking for any_errors_fatal 12755 1727204079.54103: done checking for any_errors_fatal 12755 1727204079.54104: checking for max_fail_percentage 12755 1727204079.54106: done checking for max_fail_percentage 12755 1727204079.54107: checking to see if all hosts have failed and the running result is not ok 12755 1727204079.54108: done checking to see if all hosts have failed 12755 1727204079.54109: getting the remaining hosts for this loop 12755 1727204079.54111: done getting the remaining hosts for this loop 12755 1727204079.54118: getting the next task for host managed-node1 12755 1727204079.54127: done getting next task for host managed-node1 12755 1727204079.54131: ^ task is: TASK: Install dnsmasq 12755 1727204079.54134: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204079.54140: getting variables 12755 1727204079.54142: in VariableManager get_vars() 12755 1727204079.54275: Calling all_inventory to load vars for managed-node1 12755 1727204079.54279: Calling groups_inventory to load vars for managed-node1 12755 1727204079.54282: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204079.54509: done sending task result for task 12b410aa-8751-72e9-1a19-00000000000b 12755 1727204079.54512: WORKER PROCESS EXITING 12755 1727204079.54525: Calling all_plugins_play to load vars for managed-node1 12755 1727204079.54529: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204079.54534: Calling groups_plugins_play to load vars for managed-node1 12755 1727204079.54773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204079.55055: done with get_vars() 12755 1727204079.55069: done getting variables 12755 1727204079.55145: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Install dnsmasq] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.039) 0:00:04.787 *****
12755 1727204079.55188:
entering _queue_task() for managed-node1/package 12755 1727204079.55627: worker is 1 (out of 1 available) 12755 1727204079.55639: exiting _queue_task() for managed-node1/package 12755 1727204079.55652: done queuing things up, now waiting for results queue to drain 12755 1727204079.55654: waiting for pending results... 12755 1727204079.55858: running TaskExecutor() for managed-node1/TASK: Install dnsmasq 12755 1727204079.56037: in run() - task 12b410aa-8751-72e9-1a19-00000000000f 12755 1727204079.56060: variable 'ansible_search_path' from source: unknown 12755 1727204079.56069: variable 'ansible_search_path' from source: unknown 12755 1727204079.56127: calling self._execute() 12755 1727204079.56241: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204079.56264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204079.56282: variable 'omit' from source: magic vars 12755 1727204079.56763: variable 'ansible_distribution_major_version' from source: facts 12755 1727204079.56784: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204079.56806: variable 'omit' from source: magic vars 12755 1727204079.56874: variable 'omit' from source: magic vars 12755 1727204079.57151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204079.59783: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204079.59883: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204079.59937: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204079.60071: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204079.60075: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204079.60157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204079.60205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204079.60246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204079.60312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204079.60338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204079.60469: variable '__network_is_ostree' from source: set_fact 12755 1727204079.60480: variable 'omit' from source: magic vars 12755 1727204079.60531: variable 'omit' from source: magic vars 12755 1727204079.60566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204079.60618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204079.60795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204079.60799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204079.60801: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204079.60804: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204079.60807: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204079.60809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204079.60866: Set connection var ansible_connection to ssh 12755 1727204079.60879: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204079.60887: Set connection var ansible_shell_type to sh 12755 1727204079.60909: Set connection var ansible_timeout to 10 12755 1727204079.60930: Set connection var ansible_shell_executable to /bin/sh 12755 1727204079.60942: Set connection var ansible_pipelining to False 12755 1727204079.60974: variable 'ansible_shell_executable' from source: unknown 12755 1727204079.60982: variable 'ansible_connection' from source: unknown 12755 1727204079.60997: variable 'ansible_module_compression' from source: unknown 12755 1727204079.61006: variable 'ansible_shell_type' from source: unknown 12755 1727204079.61013: variable 'ansible_shell_executable' from source: unknown 12755 1727204079.61024: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204079.61041: variable 'ansible_pipelining' from source: unknown 12755 1727204079.61049: variable 'ansible_timeout' from source: unknown 12755 1727204079.61059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204079.61194: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204079.61214: variable 'omit' from source: magic vars 12755 1727204079.61254: starting 
attempt loop 12755 1727204079.61257: running the handler 12755 1727204079.61260: variable 'ansible_facts' from source: unknown 12755 1727204079.61262: variable 'ansible_facts' from source: unknown 12755 1727204079.61302: _low_level_execute_command(): starting 12755 1727204079.61362: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204079.62069: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204079.62109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204079.62214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204079.62231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204079.62320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204079.64054: stdout chunk (state=3): >>>/root <<< 12755 1727204079.64179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204079.64266: stderr chunk (state=3): >>><<< 12755 1727204079.64282: stdout chunk (state=3): >>><<< 
12755 1727204079.64318: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204079.64346: _low_level_execute_command(): starting 12755 1727204079.64367: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074 `" && echo ansible-tmp-1727204079.6433268-13043-129841426042074="` echo /root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074 `" ) && sleep 0' 12755 1727204079.65180: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204079.65213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204079.65300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204079.67308: stdout chunk (state=3): >>>ansible-tmp-1727204079.6433268-13043-129841426042074=/root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074 <<< 12755 1727204079.67527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204079.67531: stdout chunk (state=3): >>><<< 12755 1727204079.67534: stderr chunk (state=3): >>><<< 12755 1727204079.67695: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204079.6433268-13043-129841426042074=/root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204079.67699: variable 'ansible_module_compression' from source: unknown 12755 1727204079.67704: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 12755 1727204079.67707: ANSIBALLZ: Acquiring lock 12755 1727204079.67709: ANSIBALLZ: Lock acquired: 139630693732560 12755 1727204079.67711: ANSIBALLZ: Creating module 12755 1727204079.90700: ANSIBALLZ: Writing module into payload 12755 1727204079.90984: ANSIBALLZ: Writing module 12755 1727204079.91026: ANSIBALLZ: Renaming module 12755 1727204079.91040: ANSIBALLZ: Done creating module 12755 1727204079.91066: variable 'ansible_facts' from source: unknown 12755 1727204079.91176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/AnsiballZ_dnf.py 12755 1727204079.91456: Sending initial data 12755 1727204079.91460: Sent initial data (152 bytes) 12755 1727204079.92011: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204079.92109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204079.92149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204079.92165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204079.92197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204079.92325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204079.94059: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204079.94129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204079.94170: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpo630cn4a /root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/AnsiballZ_dnf.py <<< 12755 1727204079.94174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/AnsiballZ_dnf.py" <<< 12755 1727204079.94221: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpo630cn4a" to remote "/root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/AnsiballZ_dnf.py" <<< 12755 1727204079.95759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204079.95908: stderr chunk (state=3): >>><<< 12755 1727204079.95912: stdout chunk (state=3): >>><<< 12755 1727204079.95961: done transferring module to remote 12755 1727204079.95966: _low_level_execute_command(): starting 12755 1727204079.95983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/ /root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/AnsiballZ_dnf.py && sleep 0' 12755 1727204079.96673: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204079.96694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204079.96709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204079.96743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204079.96949: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204079.96953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204079.97094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204079.97099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204079.99078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204079.99295: stderr chunk (state=3): >>><<< 12755 1727204079.99299: stdout chunk (state=3): >>><<< 12755 1727204079.99301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204079.99304: _low_level_execute_command(): starting 12755 1727204079.99306: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/AnsiballZ_dnf.py && sleep 0' 12755 1727204080.00503: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204080.00519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204080.00538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204080.00676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204081.51951: stdout chunk (state=3): >>> 
{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12755 1727204081.57142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204081.57160: stdout chunk (state=3): >>><<< 12755 1727204081.57181: stderr chunk (state=3): >>><<< 12755 1727204081.57212: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204081.57285: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204081.57312: _low_level_execute_command(): starting 12755 1727204081.57324: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204079.6433268-13043-129841426042074/ > /dev/null 2>&1 && sleep 0' 12755 1727204081.58020: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204081.58038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204081.58096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204081.58114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204081.58128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204081.58224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204081.58248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204081.58330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204081.60362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204081.60380: stderr chunk (state=3): >>><<< 12755 1727204081.60392: stdout chunk (state=3): >>><<< 12755 1727204081.60416: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 12755 1727204081.60432: handler run complete 12755 1727204081.60669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204081.60995: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204081.61119: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204081.61163: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204081.61201: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204081.61336: variable '__install_status' from source: unknown 12755 1727204081.61366: Evaluated conditional (__install_status is success): True 12755 1727204081.61397: attempt loop complete, returning result 12755 1727204081.61406: _execute() done 12755 1727204081.61414: dumping result to json 12755 1727204081.61430: done dumping result, returning 12755 1727204081.61446: done running TaskExecutor() for managed-node1/TASK: Install dnsmasq [12b410aa-8751-72e9-1a19-00000000000f] 12755 1727204081.61456: sending task result for task 12b410aa-8751-72e9-1a19-00000000000f ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12755 1727204081.61729: no more pending results, returning what we have 12755 1727204081.61733: results queue empty 12755 1727204081.61735: checking for any_errors_fatal 12755 1727204081.61742: done checking for any_errors_fatal 12755 1727204081.61743: checking for max_fail_percentage 12755 1727204081.61745: done checking for max_fail_percentage 12755 1727204081.61746: checking to see if all hosts have failed and the running result is not ok 12755 1727204081.61747: done checking to see if all hosts have failed 12755 1727204081.61748: getting the remaining hosts for this loop 12755 1727204081.61749: done getting the 
remaining hosts for this loop 12755 1727204081.61754: getting the next task for host managed-node1 12755 1727204081.61761: done getting next task for host managed-node1 12755 1727204081.61764: ^ task is: TASK: Install pgrep, sysctl 12755 1727204081.61982: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204081.61987: getting variables 12755 1727204081.61990: in VariableManager get_vars() 12755 1727204081.62048: Calling all_inventory to load vars for managed-node1 12755 1727204081.62052: Calling groups_inventory to load vars for managed-node1 12755 1727204081.62055: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204081.62062: done sending task result for task 12b410aa-8751-72e9-1a19-00000000000f 12755 1727204081.62065: WORKER PROCESS EXITING 12755 1727204081.62076: Calling all_plugins_play to load vars for managed-node1 12755 1727204081.62080: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204081.62084: Calling groups_plugins_play to load vars for managed-node1 12755 1727204081.62341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204081.62643: done with get_vars() 12755 1727204081.62657: done getting variables 12755 1727204081.62734: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:54:41 -0400 (0:00:02.075) 0:00:06.863 ***** 12755 1727204081.62770: entering _queue_task() for managed-node1/package 12755 1727204081.63052: worker is 1 (out of 1 available) 12755 1727204081.63067: exiting _queue_task() for managed-node1/package 12755 1727204081.63080: done queuing things up, now waiting for results queue to drain 12755 1727204081.63082: waiting for pending results... 12755 1727204081.63364: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 12755 1727204081.63697: in run() - task 12b410aa-8751-72e9-1a19-000000000010 12755 1727204081.63701: variable 'ansible_search_path' from source: unknown 12755 1727204081.63703: variable 'ansible_search_path' from source: unknown 12755 1727204081.63707: calling self._execute() 12755 1727204081.63710: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204081.63713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204081.63724: variable 'omit' from source: magic vars 12755 1727204081.64184: variable 'ansible_distribution_major_version' from source: facts 12755 1727204081.64207: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204081.64373: variable 'ansible_os_family' from source: facts 12755 1727204081.64387: Evaluated conditional (ansible_os_family == 'RedHat'): True 12755 1727204081.64622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204081.64951: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204081.65018: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204081.65063: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204081.65111: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204081.65209: variable 'ansible_distribution_major_version' from source: facts 12755 1727204081.65234: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 12755 1727204081.65242: when evaluation is False, skipping this task 12755 1727204081.65249: _execute() done 12755 1727204081.65256: dumping result to json 12755 1727204081.65264: done dumping result, returning 12755 1727204081.65274: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [12b410aa-8751-72e9-1a19-000000000010] 12755 1727204081.65283: sending task result for task 12b410aa-8751-72e9-1a19-000000000010 12755 1727204081.65595: done sending task result for task 12b410aa-8751-72e9-1a19-000000000010 12755 1727204081.65599: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 12755 1727204081.65642: no more pending results, returning what we have 12755 1727204081.65646: results queue empty 12755 1727204081.65647: checking for any_errors_fatal 12755 1727204081.65654: done checking for any_errors_fatal 12755 1727204081.65655: checking for max_fail_percentage 12755 1727204081.65657: done checking for max_fail_percentage 12755 1727204081.65658: checking to see if all hosts have failed and the running result is not ok 12755 1727204081.65659: done checking to see if all hosts have failed 12755 1727204081.65660: getting the remaining hosts for this loop 12755 
1727204081.65662: done getting the remaining hosts for this loop 12755 1727204081.65665: getting the next task for host managed-node1 12755 1727204081.65671: done getting next task for host managed-node1 12755 1727204081.65673: ^ task is: TASK: Install pgrep, sysctl 12755 1727204081.65677: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204081.65680: getting variables 12755 1727204081.65682: in VariableManager get_vars() 12755 1727204081.65734: Calling all_inventory to load vars for managed-node1 12755 1727204081.65737: Calling groups_inventory to load vars for managed-node1 12755 1727204081.65741: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204081.65751: Calling all_plugins_play to load vars for managed-node1 12755 1727204081.65754: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204081.65758: Calling groups_plugins_play to load vars for managed-node1 12755 1727204081.66003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204081.66295: done with get_vars() 12755 1727204081.66307: done getting variables 12755 1727204081.66376: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.036) 0:00:06.899 ***** 12755 1727204081.66412: entering _queue_task() for managed-node1/package 12755 1727204081.66657: worker is 1 (out of 1 available) 12755 1727204081.66671: exiting _queue_task() for managed-node1/package 12755 1727204081.66684: done queuing things up, now waiting for results queue to drain 12755 1727204081.66685: waiting for pending results... 12755 1727204081.66944: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 12755 1727204081.67086: in run() - task 12b410aa-8751-72e9-1a19-000000000011 12755 1727204081.67115: variable 'ansible_search_path' from source: unknown 12755 1727204081.67128: variable 'ansible_search_path' from source: unknown 12755 1727204081.67171: calling self._execute() 12755 1727204081.67272: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204081.67284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204081.67303: variable 'omit' from source: magic vars 12755 1727204081.67736: variable 'ansible_distribution_major_version' from source: facts 12755 1727204081.67760: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204081.67969: variable 'ansible_os_family' from source: facts 12755 1727204081.67972: Evaluated conditional (ansible_os_family == 'RedHat'): True 12755 1727204081.68145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204081.68533: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204081.68588: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204081.68642: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204081.68682: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204081.68774: variable 'ansible_distribution_major_version' from source: facts 12755 1727204081.68796: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 12755 1727204081.68808: variable 'omit' from source: magic vars 12755 1727204081.69094: variable 'omit' from source: magic vars 12755 1727204081.69098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204081.71486: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204081.71582: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204081.71636: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204081.71677: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204081.71713: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204081.71834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204081.71874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204081.71913: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204081.71976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204081.71999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204081.72125: variable '__network_is_ostree' from source: set_fact 12755 1727204081.72135: variable 'omit' from source: magic vars 12755 1727204081.72172: variable 'omit' from source: magic vars 12755 1727204081.72205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204081.72245: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204081.72276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204081.72306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204081.72329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204081.72372: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204081.72386: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204081.72399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204081.72533: Set connection var ansible_connection to ssh 12755 1727204081.72548: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204081.72556: Set 
connection var ansible_shell_type to sh 12755 1727204081.72575: Set connection var ansible_timeout to 10 12755 1727204081.72587: Set connection var ansible_shell_executable to /bin/sh 12755 1727204081.72696: Set connection var ansible_pipelining to False 12755 1727204081.72700: variable 'ansible_shell_executable' from source: unknown 12755 1727204081.72702: variable 'ansible_connection' from source: unknown 12755 1727204081.72705: variable 'ansible_module_compression' from source: unknown 12755 1727204081.72707: variable 'ansible_shell_type' from source: unknown 12755 1727204081.72709: variable 'ansible_shell_executable' from source: unknown 12755 1727204081.72711: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204081.72713: variable 'ansible_pipelining' from source: unknown 12755 1727204081.72718: variable 'ansible_timeout' from source: unknown 12755 1727204081.72720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204081.72836: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204081.72839: variable 'omit' from source: magic vars 12755 1727204081.72895: starting attempt loop 12755 1727204081.72899: running the handler 12755 1727204081.72902: variable 'ansible_facts' from source: unknown 12755 1727204081.72904: variable 'ansible_facts' from source: unknown 12755 1727204081.72923: _low_level_execute_command(): starting 12755 1727204081.72939: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204081.73700: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204081.73723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204081.73834: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204081.73861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204081.73878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204081.73936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204081.74045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204081.75763: stdout chunk (state=3): >>>/root <<< 12755 1727204081.75940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204081.75968: stdout chunk (state=3): >>><<< 12755 1727204081.75983: stderr chunk (state=3): >>><<< 12755 1727204081.76123: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204081.76127: _low_level_execute_command(): starting 12755 1727204081.76131: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259 `" && echo ansible-tmp-1727204081.7601397-13302-270132462145259="` echo /root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259 `" ) && sleep 0' 12755 1727204081.76748: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204081.76762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204081.76775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204081.76809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204081.76924: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204081.76963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204081.77009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204081.79142: stdout chunk (state=3): >>>ansible-tmp-1727204081.7601397-13302-270132462145259=/root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259 <<< 12755 1727204081.79328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204081.79332: stdout chunk (state=3): >>><<< 12755 1727204081.79335: stderr chunk (state=3): >>><<< 12755 1727204081.79362: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204081.7601397-13302-270132462145259=/root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204081.79417: variable 'ansible_module_compression' from source: unknown 12755 1727204081.79493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 12755 1727204081.79760: variable 'ansible_facts' from source: unknown 12755 1727204081.79764: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/AnsiballZ_dnf.py 12755 1727204081.79914: Sending initial data 12755 1727204081.79917: Sent initial data (152 bytes) 12755 1727204081.80473: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204081.80496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204081.80606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204081.80628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204081.80647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204081.80672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204081.80750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204081.82420: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204081.82487: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204081.82535: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpfvasroxv /root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/AnsiballZ_dnf.py <<< 12755 1727204081.82551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/AnsiballZ_dnf.py" <<< 12755 1727204081.82576: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpfvasroxv" to remote "/root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/AnsiballZ_dnf.py" <<< 12755 1727204081.84466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204081.84470: stderr chunk (state=3): >>><<< 12755 1727204081.84473: stdout chunk (state=3): >>><<< 12755 1727204081.84475: done transferring module to remote 12755 1727204081.84477: _low_level_execute_command(): starting 12755 1727204081.84480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/ /root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/AnsiballZ_dnf.py && sleep 0' 12755 1727204081.85132: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204081.85136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204081.85139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass <<< 12755 1727204081.85142: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204081.85216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204081.85268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204081.87314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204081.87317: stdout chunk (state=3): >>><<< 12755 1727204081.87320: stderr chunk (state=3): >>><<< 12755 1727204081.87338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204081.87395: _low_level_execute_command(): starting 12755 1727204081.87398: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/AnsiballZ_dnf.py && sleep 0' 12755 1727204081.88025: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204081.88042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204081.88106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204081.88279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204081.88299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204081.88324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204081.88501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 12755 1727204083.37701: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12755 1727204083.42288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204083.42600: stderr chunk (state=3): >>><<< 12755 1727204083.42604: stdout chunk (state=3): >>><<< 12755 1727204083.42608: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204083.42802: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204083.42806: _low_level_execute_command(): starting 12755 1727204083.42808: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204081.7601397-13302-270132462145259/ > /dev/null 2>&1 && sleep 0' 12755 1727204083.44101: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204083.44227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204083.44350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204083.44445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204083.46494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204083.46498: stdout chunk (state=3): >>><<< 12755 1727204083.46501: stderr chunk (state=3): >>><<< 12755 1727204083.46523: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204083.46538: handler run complete 12755 1727204083.47099: attempt loop complete, returning result 12755 1727204083.47103: _execute() done 12755 1727204083.47105: 
dumping result to json 12755 1727204083.47108: done dumping result, returning 12755 1727204083.47110: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [12b410aa-8751-72e9-1a19-000000000011] 12755 1727204083.47112: sending task result for task 12b410aa-8751-72e9-1a19-000000000011 12755 1727204083.47202: done sending task result for task 12b410aa-8751-72e9-1a19-000000000011 12755 1727204083.47206: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12755 1727204083.47311: no more pending results, returning what we have 12755 1727204083.47318: results queue empty 12755 1727204083.47320: checking for any_errors_fatal 12755 1727204083.47328: done checking for any_errors_fatal 12755 1727204083.47329: checking for max_fail_percentage 12755 1727204083.47331: done checking for max_fail_percentage 12755 1727204083.47332: checking to see if all hosts have failed and the running result is not ok 12755 1727204083.47333: done checking to see if all hosts have failed 12755 1727204083.47334: getting the remaining hosts for this loop 12755 1727204083.47336: done getting the remaining hosts for this loop 12755 1727204083.47341: getting the next task for host managed-node1 12755 1727204083.47349: done getting next task for host managed-node1 12755 1727204083.47353: ^ task is: TASK: Create test interfaces 12755 1727204083.47356: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204083.47361: getting variables 12755 1727204083.47363: in VariableManager get_vars() 12755 1727204083.47829: Calling all_inventory to load vars for managed-node1 12755 1727204083.47833: Calling groups_inventory to load vars for managed-node1 12755 1727204083.47836: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204083.47848: Calling all_plugins_play to load vars for managed-node1 12755 1727204083.47852: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204083.47856: Calling groups_plugins_play to load vars for managed-node1 12755 1727204083.48397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204083.49103: done with get_vars() 12755 1727204083.49120: done getting variables 12755 1727204083.49227: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:54:43 -0400 (0:00:01.828) 0:00:08.728 ***** 12755 1727204083.49263: entering _queue_task() for managed-node1/shell 12755 1727204083.49266: Creating lock for shell 12755 1727204083.50193: worker is 1 (out of 1 available) 12755 1727204083.50205: exiting _queue_task() for managed-node1/shell 12755 1727204083.50219: done queuing things up, now waiting for results queue to drain 12755 1727204083.50221: waiting for pending results... 
12755 1727204083.50435: running TaskExecutor() for managed-node1/TASK: Create test interfaces 12755 1727204083.50595: in run() - task 12b410aa-8751-72e9-1a19-000000000012 12755 1727204083.50599: variable 'ansible_search_path' from source: unknown 12755 1727204083.50602: variable 'ansible_search_path' from source: unknown 12755 1727204083.51426: calling self._execute() 12755 1727204083.51536: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204083.51539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204083.51542: variable 'omit' from source: magic vars 12755 1727204083.52496: variable 'ansible_distribution_major_version' from source: facts 12755 1727204083.52499: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204083.52502: variable 'omit' from source: magic vars 12755 1727204083.52532: variable 'omit' from source: magic vars 12755 1727204083.53325: variable 'dhcp_interface1' from source: play vars 12755 1727204083.53339: variable 'dhcp_interface2' from source: play vars 12755 1727204083.53387: variable 'omit' from source: magic vars 12755 1727204083.53441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204083.53737: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204083.53795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204083.53958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204083.53962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204083.53964: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204083.53967: variable 'ansible_host' from source: host 
vars for 'managed-node1' 12755 1727204083.53969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204083.54068: Set connection var ansible_connection to ssh 12755 1727204083.54208: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204083.54218: Set connection var ansible_shell_type to sh 12755 1727204083.54240: Set connection var ansible_timeout to 10 12755 1727204083.54254: Set connection var ansible_shell_executable to /bin/sh 12755 1727204083.54294: Set connection var ansible_pipelining to False 12755 1727204083.54335: variable 'ansible_shell_executable' from source: unknown 12755 1727204083.54402: variable 'ansible_connection' from source: unknown 12755 1727204083.54416: variable 'ansible_module_compression' from source: unknown 12755 1727204083.54426: variable 'ansible_shell_type' from source: unknown 12755 1727204083.54435: variable 'ansible_shell_executable' from source: unknown 12755 1727204083.54443: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204083.54452: variable 'ansible_pipelining' from source: unknown 12755 1727204083.54460: variable 'ansible_timeout' from source: unknown 12755 1727204083.54745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204083.54844: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204083.55035: variable 'omit' from source: magic vars 12755 1727204083.55038: starting attempt loop 12755 1727204083.55040: running the handler 12755 1727204083.55043: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204083.55046: _low_level_execute_command(): starting 12755 1727204083.55055: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204083.56808: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204083.56832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204083.56938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204083.57082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204083.57153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204083.58938: stdout chunk (state=3): >>>/root <<< 12755 1727204083.59054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204083.59136: stderr chunk (state=3): >>><<< 12755 1727204083.59150: stdout chunk (state=3): >>><<< 12755 
1727204083.59199: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204083.59795: _low_level_execute_command(): starting 12755 1727204083.59799: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841 `" && echo ansible-tmp-1727204083.5923407-13364-26107398118841="` echo /root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841 `" ) && sleep 0' 12755 1727204083.60581: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204083.60620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204083.62757: stdout chunk (state=3): >>>ansible-tmp-1727204083.5923407-13364-26107398118841=/root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841 <<< 12755 1727204083.62876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204083.63028: stderr chunk (state=3): >>><<< 12755 1727204083.63079: stdout chunk (state=3): >>><<< 12755 1727204083.63108: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204083.5923407-13364-26107398118841=/root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204083.63338: variable 'ansible_module_compression' from source: unknown 12755 1727204083.63498: ANSIBALLZ: Using generic lock for ansible.legacy.command 12755 1727204083.63501: ANSIBALLZ: Acquiring lock 12755 1727204083.63504: ANSIBALLZ: Lock acquired: 139630693732560 12755 1727204083.63507: ANSIBALLZ: Creating module 12755 1727204083.91526: ANSIBALLZ: Writing module into payload 12755 1727204083.91660: ANSIBALLZ: Writing module 12755 1727204083.91922: ANSIBALLZ: Renaming module 12755 1727204083.91937: ANSIBALLZ: Done creating module 12755 1727204083.91962: variable 'ansible_facts' from source: unknown 12755 1727204083.92068: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/AnsiballZ_command.py 12755 1727204083.92724: Sending initial data 12755 1727204083.92728: Sent initial data (155 bytes) 12755 1727204083.94257: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204083.94272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204083.94285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204083.94466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204083.94605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204083.94625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204083.96432: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204083.96470: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204083.96507: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpbca1mr_s /root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/AnsiballZ_command.py <<< 12755 1727204083.96511: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/AnsiballZ_command.py" <<< 12755 1727204083.96570: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpbca1mr_s" to remote "/root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/AnsiballZ_command.py" <<< 12755 1727204083.98873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204083.98903: stderr chunk (state=3): >>><<< 12755 1727204083.98907: stdout chunk (state=3): >>><<< 12755 1727204083.99030: done transferring module to remote 12755 1727204083.99045: _low_level_execute_command(): starting 12755 1727204083.99051: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/ /root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/AnsiballZ_command.py && sleep 0' 12755 1727204084.00303: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204084.00310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204084.00330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204084.00470: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204084.00606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204084.00609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204084.00642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204084.00801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204084.02843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204084.02852: stdout chunk (state=3): >>><<< 12755 1727204084.02854: stderr chunk (state=3): >>><<< 12755 1727204084.02874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204084.02885: _low_level_execute_command(): starting 12755 1727204084.02900: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/AnsiballZ_command.py && sleep 0' 12755 1727204084.03951: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204084.03969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204084.03984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204084.04024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204084.04107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204084.04169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204084.04201: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204084.04342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204085.48594: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 651 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 651 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif 
[ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth 
namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! <<< 12755 1727204085.48632: stdout chunk (state=3): >>>firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:44.218829", "end": "2024-09-24 14:54:45.484636", "delta": "0:00:01.265807", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204085.50458: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204085.50462: stdout chunk (state=3): >>><<< 12755 1727204085.50465: stderr chunk (state=3): >>><<< 12755 1727204085.50501: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 651 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 651 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 
--dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:44.218829", "end": "2024-09-24 14:54:45.484636", "delta": "0:00:01.265807", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
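A note on the repeated `auto-mux: Trying existing master` / `mux_client_request_session` lines in the stderr dumps above: these come from OpenSSH connection multiplexing, which Ansible's ssh connection plugin enables by default (its default `ssh_args` include `-o ControlMaster=auto -o ControlPersist=60s`), so each task reuses one persistent master connection instead of a fresh TCP/SSH handshake. The equivalent standalone ssh_config fragment would look roughly like the following; the exact `ControlPath` Ansible uses is version-dependent, so this is an illustration rather than the literal configuration in effect here:

```
# Illustrative ssh_config equivalent of Ansible's default multiplexing
# options (ControlPath shown here is an assumption; recent Ansible
# versions derive it automatically under ~/.ansible/cp/).
Host 10.31.11.210
    ControlMaster auto
    ControlPersist 60s
    ControlPath ~/.ansible/cp/%C
```

With multiplexing active, only the first connection performs key exchange and authentication; subsequent commands (the `chmod`, the module execution, and the `rm -f -r` cleanup seen in this log) are cheap mux-client sessions, which is why each reports `mux_client_request_session: master session id: 2`.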
12755 1727204085.50595: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204085.50626: _low_level_execute_command(): starting 12755 1727204085.50629: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204083.5923407-13364-26107398118841/ > /dev/null 2>&1 && sleep 0' 12755 1727204085.51322: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204085.51339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204085.51367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204085.51387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204085.51479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204085.51522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204085.51544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204085.51577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204085.51654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204085.53791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204085.53795: stdout chunk (state=3): >>><<< 12755 1727204085.53797: stderr chunk (state=3): >>><<< 12755 1727204085.53820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204085.53994: handler run complete 12755 1727204085.53997: Evaluated conditional (False): False 12755 1727204085.53999: attempt loop complete, returning result 12755 1727204085.54001: _execute() done 12755 1727204085.54003: dumping result to json 12755 1727204085.54005: done dumping result, returning 12755 1727204085.54007: done running TaskExecutor() for managed-node1/TASK: Create test interfaces [12b410aa-8751-72e9-1a19-000000000012] 12755 1727204085.54009: sending task result for task 12b410aa-8751-72e9-1a19-000000000012 12755 1727204085.54085: done sending task result for task 12b410aa-8751-72e9-1a19-000000000012 12755 1727204085.54088: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.265807", "end": "2024-09-24 14:54:45.484636", "rc": 0, "start": "2024-09-24 14:54:44.218829" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 651 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 651 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 12755 1727204085.54377: no more pending results, returning what we have 12755 1727204085.54381: results queue empty 12755 1727204085.54382: checking for any_errors_fatal 12755 1727204085.54391: done checking for any_errors_fatal 12755 1727204085.54392: checking for max_fail_percentage 12755 1727204085.54394: done checking for max_fail_percentage 12755 1727204085.54395: checking to see if all hosts have failed and 
the running result is not ok 12755 1727204085.54396: done checking to see if all hosts have failed 12755 1727204085.54397: getting the remaining hosts for this loop 12755 1727204085.54398: done getting the remaining hosts for this loop 12755 1727204085.54402: getting the next task for host managed-node1 12755 1727204085.54411: done getting next task for host managed-node1 12755 1727204085.54415: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12755 1727204085.54418: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204085.54422: getting variables 12755 1727204085.54423: in VariableManager get_vars() 12755 1727204085.54477: Calling all_inventory to load vars for managed-node1 12755 1727204085.54480: Calling groups_inventory to load vars for managed-node1 12755 1727204085.54483: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204085.54500: Calling all_plugins_play to load vars for managed-node1 12755 1727204085.54504: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204085.54508: Calling groups_plugins_play to load vars for managed-node1 12755 1727204085.54743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204085.55048: done with get_vars() 12755 1727204085.55061: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3
Tuesday 24 September 2024 14:54:45 -0400 (0:00:02.059) 0:00:10.787 *****

12755 1727204085.55180: entering _queue_task() for managed-node1/include_tasks 12755 1727204085.55588: worker is 1 (out of 1 available) 12755 1727204085.55604: exiting _queue_task() for managed-node1/include_tasks 12755 1727204085.55616: done queuing things up, now waiting for results queue to drain 12755 1727204085.55618: waiting for pending results... 
12755 1727204085.56022: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 12755 1727204085.56028: in run() - task 12b410aa-8751-72e9-1a19-000000000016 12755 1727204085.56032: variable 'ansible_search_path' from source: unknown 12755 1727204085.56035: variable 'ansible_search_path' from source: unknown 12755 1727204085.56041: calling self._execute() 12755 1727204085.56155: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204085.56170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204085.56188: variable 'omit' from source: magic vars 12755 1727204085.56731: variable 'ansible_distribution_major_version' from source: facts 12755 1727204085.56754: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204085.56775: _execute() done 12755 1727204085.56786: dumping result to json 12755 1727204085.56796: done dumping result, returning 12755 1727204085.56809: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-72e9-1a19-000000000016] 12755 1727204085.56820: sending task result for task 12b410aa-8751-72e9-1a19-000000000016 12755 1727204085.57081: no more pending results, returning what we have 12755 1727204085.57088: in VariableManager get_vars() 12755 1727204085.57161: Calling all_inventory to load vars for managed-node1 12755 1727204085.57166: Calling groups_inventory to load vars for managed-node1 12755 1727204085.57169: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204085.57186: Calling all_plugins_play to load vars for managed-node1 12755 1727204085.57191: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204085.57196: Calling groups_plugins_play to load vars for managed-node1 12755 1727204085.57614: done sending task result for task 12b410aa-8751-72e9-1a19-000000000016 12755 1727204085.57617: WORKER PROCESS EXITING 12755 
1727204085.57650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204085.57937: done with get_vars() 12755 1727204085.57952: variable 'ansible_search_path' from source: unknown 12755 1727204085.57954: variable 'ansible_search_path' from source: unknown 12755 1727204085.58005: we have included files to process 12755 1727204085.58007: generating all_blocks data 12755 1727204085.58008: done generating all_blocks data 12755 1727204085.58009: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12755 1727204085.58011: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12755 1727204085.58013: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12755 1727204085.58299: done processing included file 12755 1727204085.58302: iterating over new_blocks loaded from include file 12755 1727204085.58303: in VariableManager get_vars() 12755 1727204085.58336: done with get_vars() 12755 1727204085.58338: filtering new block on tags 12755 1727204085.58359: done filtering new block on tags 12755 1727204085.58362: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 12755 1727204085.58367: extending task lists for all hosts with included blocks 12755 1727204085.58505: done extending task lists 12755 1727204085.58507: done processing included files 12755 1727204085.58508: results queue empty 12755 1727204085.58509: checking for any_errors_fatal 12755 1727204085.58514: done checking for any_errors_fatal 12755 1727204085.58515: checking for max_fail_percentage 12755 1727204085.58517: done checking for 
max_fail_percentage 12755 1727204085.58518: checking to see if all hosts have failed and the running result is not ok 12755 1727204085.58519: done checking to see if all hosts have failed 12755 1727204085.58520: getting the remaining hosts for this loop 12755 1727204085.58521: done getting the remaining hosts for this loop 12755 1727204085.58524: getting the next task for host managed-node1 12755 1727204085.58529: done getting next task for host managed-node1 12755 1727204085.58532: ^ task is: TASK: Get stat for interface {{ interface }} 12755 1727204085.58536: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204085.58538: getting variables 12755 1727204085.58539: in VariableManager get_vars() 12755 1727204085.58562: Calling all_inventory to load vars for managed-node1 12755 1727204085.58565: Calling groups_inventory to load vars for managed-node1 12755 1727204085.58568: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204085.58573: Calling all_plugins_play to load vars for managed-node1 12755 1727204085.58576: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204085.58580: Calling groups_plugins_play to load vars for managed-node1 12755 1727204085.58787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204085.59107: done with get_vars() 12755 1727204085.59118: done getting variables 12755 1727204085.59316: variable 'interface' from source: task vars 12755 1727204085.59321: variable 'dhcp_interface1' from source: play vars 12755 1727204085.59409: variable 'dhcp_interface1' from source: play vars

TASK [Get stat for interface test1] ********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.042) 0:00:10.830 *****

12755 1727204085.59459: entering _queue_task() for managed-node1/stat 12755 1727204085.59751: worker is 1 (out of 1 available) 12755 1727204085.59765: exiting _queue_task() for managed-node1/stat 12755 1727204085.59778: done queuing things up, now waiting for results queue to drain 12755 1727204085.59780: waiting for pending results... 
12755 1727204085.60060: running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 12755 1727204085.60231: in run() - task 12b410aa-8751-72e9-1a19-000000000248 12755 1727204085.60256: variable 'ansible_search_path' from source: unknown 12755 1727204085.60265: variable 'ansible_search_path' from source: unknown 12755 1727204085.60309: calling self._execute() 12755 1727204085.60418: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204085.60431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204085.60444: variable 'omit' from source: magic vars 12755 1727204085.60899: variable 'ansible_distribution_major_version' from source: facts 12755 1727204085.60903: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204085.60906: variable 'omit' from source: magic vars 12755 1727204085.60969: variable 'omit' from source: magic vars 12755 1727204085.61097: variable 'interface' from source: task vars 12755 1727204085.61108: variable 'dhcp_interface1' from source: play vars 12755 1727204085.61197: variable 'dhcp_interface1' from source: play vars 12755 1727204085.61336: variable 'omit' from source: magic vars 12755 1727204085.61339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204085.61342: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204085.61353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204085.61378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204085.61401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204085.61447: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 12755 1727204085.61457: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204085.61465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204085.61600: Set connection var ansible_connection to ssh 12755 1727204085.61615: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204085.61623: Set connection var ansible_shell_type to sh 12755 1727204085.61642: Set connection var ansible_timeout to 10 12755 1727204085.61660: Set connection var ansible_shell_executable to /bin/sh 12755 1727204085.61677: Set connection var ansible_pipelining to False 12755 1727204085.61709: variable 'ansible_shell_executable' from source: unknown 12755 1727204085.61719: variable 'ansible_connection' from source: unknown 12755 1727204085.61729: variable 'ansible_module_compression' from source: unknown 12755 1727204085.61737: variable 'ansible_shell_type' from source: unknown 12755 1727204085.61744: variable 'ansible_shell_executable' from source: unknown 12755 1727204085.61751: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204085.61759: variable 'ansible_pipelining' from source: unknown 12755 1727204085.61770: variable 'ansible_timeout' from source: unknown 12755 1727204085.61794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204085.62035: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204085.62097: variable 'omit' from source: magic vars 12755 1727204085.62101: starting attempt loop 12755 1727204085.62108: running the handler 12755 1727204085.62111: _low_level_execute_command(): starting 12755 1727204085.62113: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 
1727204085.62993: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204085.63054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204085.63073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204085.63166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204085.64949: stdout chunk (state=3): >>>/root <<< 12755 1727204085.65296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204085.65300: stderr chunk (state=3): >>><<< 12755 1727204085.65302: stdout chunk (state=3): >>><<< 12755 1727204085.65304: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204085.65307: _low_level_execute_command(): starting 12755 1727204085.65310: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158 `" && echo ansible-tmp-1727204085.651596-13442-121780150881158="` echo /root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158 `" ) && sleep 0' 12755 1727204085.65868: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204085.65877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204085.65908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204085.65947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204085.65960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204085.66093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204085.66102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204085.66295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204085.68206: stdout chunk (state=3): >>>ansible-tmp-1727204085.651596-13442-121780150881158=/root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158 <<< 12755 1727204085.68342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204085.68427: stderr chunk (state=3): >>><<< 12755 1727204085.68453: stdout chunk (state=3): >>><<< 12755 1727204085.68479: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204085.651596-13442-121780150881158=/root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204085.68553: variable 'ansible_module_compression' from source: unknown 12755 1727204085.68621: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12755 1727204085.68676: variable 'ansible_facts' from source: unknown 12755 1727204085.68792: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/AnsiballZ_stat.py 12755 1727204085.69011: Sending initial data 12755 1727204085.69015: Sent initial data (152 bytes) 12755 1727204085.69686: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204085.69799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204085.69841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204085.69880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204085.71642: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204085.71647: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204085.71706: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpwvpk6xtj /root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/AnsiballZ_stat.py <<< 12755 1727204085.71710: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/AnsiballZ_stat.py" <<< 12755 1727204085.71744: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpwvpk6xtj" to remote "/root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/AnsiballZ_stat.py" <<< 12755 1727204085.72885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204085.73005: stderr chunk (state=3): >>><<< 12755 1727204085.73018: stdout chunk (state=3): >>><<< 12755 1727204085.73079: done transferring module to remote 12755 1727204085.73082: _low_level_execute_command(): starting 12755 1727204085.73086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/ /root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/AnsiballZ_stat.py && sleep 0' 12755 1727204085.73849: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204085.73944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204085.73962: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204085.74007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204085.74029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204085.74091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204085.74123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204085.76349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204085.76357: stdout chunk (state=3): >>><<< 12755 1727204085.76360: stderr chunk (state=3): >>><<< 12755 1727204085.76582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204085.76586: _low_level_execute_command(): starting 12755 1727204085.76591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/AnsiballZ_stat.py && sleep 0' 12755 1727204085.77141: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204085.77156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204085.77169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204085.77203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204085.77215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204085.77308: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204085.77337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204085.77356: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204085.77446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204085.95252: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35366, "dev": 23, "nlink": 1, "atime": 1727204084.2258165, "mtime": 1727204084.2258165, "ctime": 1727204084.2258165, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12755 1727204085.96972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204085.96976: stdout chunk (state=3): >>><<< 12755 1727204085.96978: stderr chunk (state=3): >>><<< 12755 1727204085.97001: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35366, "dev": 23, "nlink": 1, "atime": 1727204084.2258165, "mtime": 1727204084.2258165, "ctime": 1727204084.2258165, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204085.97100: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204085.97202: _low_level_execute_command(): starting 12755 1727204085.97206: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204085.651596-13442-121780150881158/ > /dev/null 2>&1 && sleep 0' 12755 1727204085.97825: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204085.97854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204085.97956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204085.97974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204085.98019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204085.98046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204085.98103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204085.98149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204086.00213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204086.00270: stderr chunk (state=3): >>><<< 12755 1727204086.00283: stdout chunk (state=3): >>><<< 12755 1727204086.00315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204086.00500: handler run complete 12755 1727204086.00503: attempt loop complete, returning result 12755 1727204086.00506: _execute() done 12755 1727204086.00508: dumping result to json 12755 1727204086.00510: done dumping result, returning 12755 1727204086.00512: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 [12b410aa-8751-72e9-1a19-000000000248] 12755 1727204086.00515: sending task result for task 12b410aa-8751-72e9-1a19-000000000248 ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204084.2258165, "block_size": 4096, "blocks": 0, "ctime": 1727204084.2258165, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35366, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204084.2258165, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12755 1727204086.00857: no more pending results, returning what we have 12755 1727204086.00862: results queue empty 12755 1727204086.00863: checking for any_errors_fatal 12755 1727204086.00865: done checking for any_errors_fatal 12755 1727204086.00866: checking for max_fail_percentage 12755 1727204086.00868: done checking for 
max_fail_percentage 12755 1727204086.00870: checking to see if all hosts have failed and the running result is not ok 12755 1727204086.00871: done checking to see if all hosts have failed 12755 1727204086.00872: getting the remaining hosts for this loop 12755 1727204086.00873: done getting the remaining hosts for this loop 12755 1727204086.00880: getting the next task for host managed-node1 12755 1727204086.00996: done getting next task for host managed-node1 12755 1727204086.01007: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12755 1727204086.01012: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204086.01021: getting variables 12755 1727204086.01023: in VariableManager get_vars() 12755 1727204086.01087: Calling all_inventory to load vars for managed-node1 12755 1727204086.01093: Calling groups_inventory to load vars for managed-node1 12755 1727204086.01096: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.01229: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.01234: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.01241: done sending task result for task 12b410aa-8751-72e9-1a19-000000000248 12755 1727204086.01245: WORKER PROCESS EXITING 12755 1727204086.01250: Calling groups_plugins_play to load vars for managed-node1 12755 1727204086.01569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.01892: done with get_vars() 12755 1727204086.01905: done getting variables 12755 1727204086.02024: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 12755 1727204086.02168: variable 'interface' from source: task vars 12755 1727204086.02172: variable 'dhcp_interface1' from source: play vars 12755 1727204086.02255: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.428) 0:00:11.258 ***** 12755 1727204086.02294: entering _queue_task() for managed-node1/assert 12755 1727204086.02297: Creating lock for assert 12755 1727204086.02673: worker is 1 (out of 1 available) 12755 1727204086.02687: 
exiting _queue_task() for managed-node1/assert 12755 1727204086.02701: done queuing things up, now waiting for results queue to drain 12755 1727204086.02702: waiting for pending results... 12755 1727204086.02959: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' 12755 1727204086.03125: in run() - task 12b410aa-8751-72e9-1a19-000000000017 12755 1727204086.03146: variable 'ansible_search_path' from source: unknown 12755 1727204086.03157: variable 'ansible_search_path' from source: unknown 12755 1727204086.03209: calling self._execute() 12755 1727204086.03326: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.03341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.03358: variable 'omit' from source: magic vars 12755 1727204086.03934: variable 'ansible_distribution_major_version' from source: facts 12755 1727204086.03959: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204086.03978: variable 'omit' from source: magic vars 12755 1727204086.04044: variable 'omit' from source: magic vars 12755 1727204086.04171: variable 'interface' from source: task vars 12755 1727204086.04287: variable 'dhcp_interface1' from source: play vars 12755 1727204086.04290: variable 'dhcp_interface1' from source: play vars 12755 1727204086.04295: variable 'omit' from source: magic vars 12755 1727204086.04353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204086.04402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204086.04442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204086.04469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.04487: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.04538: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204086.04547: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.04633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.04692: Set connection var ansible_connection to ssh 12755 1727204086.04707: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204086.04718: Set connection var ansible_shell_type to sh 12755 1727204086.04747: Set connection var ansible_timeout to 10 12755 1727204086.04761: Set connection var ansible_shell_executable to /bin/sh 12755 1727204086.04774: Set connection var ansible_pipelining to False 12755 1727204086.04806: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.04819: variable 'ansible_connection' from source: unknown 12755 1727204086.04828: variable 'ansible_module_compression' from source: unknown 12755 1727204086.04837: variable 'ansible_shell_type' from source: unknown 12755 1727204086.04852: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.04861: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.04871: variable 'ansible_pipelining' from source: unknown 12755 1727204086.04960: variable 'ansible_timeout' from source: unknown 12755 1727204086.04964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.05073: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204086.05094: variable 'omit' from source: magic vars 12755 1727204086.05104: 
starting attempt loop 12755 1727204086.05112: running the handler 12755 1727204086.05295: variable 'interface_stat' from source: set_fact 12755 1727204086.05328: Evaluated conditional (interface_stat.stat.exists): True 12755 1727204086.05341: handler run complete 12755 1727204086.05364: attempt loop complete, returning result 12755 1727204086.05372: _execute() done 12755 1727204086.05381: dumping result to json 12755 1727204086.05395: done dumping result, returning 12755 1727204086.05413: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' [12b410aa-8751-72e9-1a19-000000000017] 12755 1727204086.05427: sending task result for task 12b410aa-8751-72e9-1a19-000000000017 12755 1727204086.05658: done sending task result for task 12b410aa-8751-72e9-1a19-000000000017 12755 1727204086.05661: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204086.05721: no more pending results, returning what we have 12755 1727204086.05725: results queue empty 12755 1727204086.05727: checking for any_errors_fatal 12755 1727204086.05739: done checking for any_errors_fatal 12755 1727204086.05740: checking for max_fail_percentage 12755 1727204086.05742: done checking for max_fail_percentage 12755 1727204086.05744: checking to see if all hosts have failed and the running result is not ok 12755 1727204086.05746: done checking to see if all hosts have failed 12755 1727204086.05746: getting the remaining hosts for this loop 12755 1727204086.05748: done getting the remaining hosts for this loop 12755 1727204086.05753: getting the next task for host managed-node1 12755 1727204086.05764: done getting next task for host managed-node1 12755 1727204086.05767: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12755 1727204086.05898: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204086.05904: getting variables 12755 1727204086.05906: in VariableManager get_vars() 12755 1727204086.06070: Calling all_inventory to load vars for managed-node1 12755 1727204086.06074: Calling groups_inventory to load vars for managed-node1 12755 1727204086.06078: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.06088: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.06094: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.06099: Calling groups_plugins_play to load vars for managed-node1 12755 1727204086.06333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.06633: done with get_vars() 12755 1727204086.06650: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.044) 0:00:11.303 ***** 12755 1727204086.06766: entering _queue_task() for managed-node1/include_tasks 12755 1727204086.07059: worker is 1 (out of 1 available) 12755 1727204086.07075: exiting _queue_task() for managed-node1/include_tasks 12755 1727204086.07202: done queuing things up, now waiting for results queue to drain 12755 1727204086.07205: waiting for pending results... 
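The assert task traced above (task path `assert_device_present.yml:5`, preceded by the include at `:3`) can be sketched from the log alone. This is a hedged reconstruction, not the actual file contents: the task names and the evaluated conditional `interface_stat.stat.exists` are taken verbatim from the log; everything else (indentation, exact module spelling) is an assumption.

```yaml
# Hedged reconstruction of tests/network/playbooks/tasks/assert_device_present.yml
# in the fedora.linux_system_roles collection, inferred from this log.
# Task names and the conditional are copied from the log; structure is assumed.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
```

The log confirms the flow this sketch implies: the include loads `get_interface_stat.yml` for `managed-node1`, and the assert handler evaluates `(interface_stat.stat.exists): True` before reporting "All assertions passed".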
12755 1727204086.07512: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 12755 1727204086.07565: in run() - task 12b410aa-8751-72e9-1a19-00000000001b 12755 1727204086.07585: variable 'ansible_search_path' from source: unknown 12755 1727204086.07598: variable 'ansible_search_path' from source: unknown 12755 1727204086.07667: calling self._execute() 12755 1727204086.07853: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.07856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.07859: variable 'omit' from source: magic vars 12755 1727204086.08249: variable 'ansible_distribution_major_version' from source: facts 12755 1727204086.08270: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204086.08287: _execute() done 12755 1727204086.08300: dumping result to json 12755 1727204086.08309: done dumping result, returning 12755 1727204086.08323: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-72e9-1a19-00000000001b] 12755 1727204086.08397: sending task result for task 12b410aa-8751-72e9-1a19-00000000001b 12755 1727204086.08474: done sending task result for task 12b410aa-8751-72e9-1a19-00000000001b 12755 1727204086.08477: WORKER PROCESS EXITING 12755 1727204086.08509: no more pending results, returning what we have 12755 1727204086.08515: in VariableManager get_vars() 12755 1727204086.08581: Calling all_inventory to load vars for managed-node1 12755 1727204086.08585: Calling groups_inventory to load vars for managed-node1 12755 1727204086.08588: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.08607: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.08611: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.08615: Calling groups_plugins_play to load vars for managed-node1 12755 
1727204086.09041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.09364: done with get_vars() 12755 1727204086.09374: variable 'ansible_search_path' from source: unknown 12755 1727204086.09376: variable 'ansible_search_path' from source: unknown 12755 1727204086.09424: we have included files to process 12755 1727204086.09426: generating all_blocks data 12755 1727204086.09427: done generating all_blocks data 12755 1727204086.09432: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12755 1727204086.09433: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12755 1727204086.09436: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12755 1727204086.09715: done processing included file 12755 1727204086.09720: iterating over new_blocks loaded from include file 12755 1727204086.09722: in VariableManager get_vars() 12755 1727204086.09755: done with get_vars() 12755 1727204086.09757: filtering new block on tags 12755 1727204086.09776: done filtering new block on tags 12755 1727204086.09779: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 12755 1727204086.09790: extending task lists for all hosts with included blocks 12755 1727204086.09928: done extending task lists 12755 1727204086.09930: done processing included files 12755 1727204086.09931: results queue empty 12755 1727204086.09932: checking for any_errors_fatal 12755 1727204086.09935: done checking for any_errors_fatal 12755 1727204086.09936: checking for max_fail_percentage 12755 1727204086.09938: done checking for 
max_fail_percentage 12755 1727204086.09939: checking to see if all hosts have failed and the running result is not ok 12755 1727204086.09940: done checking to see if all hosts have failed 12755 1727204086.09941: getting the remaining hosts for this loop 12755 1727204086.09942: done getting the remaining hosts for this loop 12755 1727204086.09945: getting the next task for host managed-node1 12755 1727204086.09951: done getting next task for host managed-node1 12755 1727204086.09954: ^ task is: TASK: Get stat for interface {{ interface }} 12755 1727204086.09957: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204086.09960: getting variables 12755 1727204086.09961: in VariableManager get_vars() 12755 1727204086.09984: Calling all_inventory to load vars for managed-node1 12755 1727204086.09987: Calling groups_inventory to load vars for managed-node1 12755 1727204086.09992: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.09998: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.10006: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.10010: Calling groups_plugins_play to load vars for managed-node1 12755 1727204086.10209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.10781: done with get_vars() 12755 1727204086.10795: done getting variables 12755 1727204086.11207: variable 'interface' from source: task vars 12755 1727204086.11212: variable 'dhcp_interface2' from source: play vars 12755 1727204086.11287: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.046) 0:00:11.349 ***** 12755 1727204086.11410: entering _queue_task() for managed-node1/stat 12755 1727204086.11888: worker is 1 (out of 1 available) 12755 1727204086.11908: exiting _queue_task() for managed-node1/stat 12755 1727204086.11926: done queuing things up, now waiting for results queue to drain 12755 1727204086.11927: waiting for pending results... 
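The `stat` task being queued here (task path `get_interface_stat.yml:3`) can likewise be sketched from the `invocation.module_args` block recorded earlier in this log for the `test1` run. This is a hedged sketch: the module arguments (`path`, `get_attributes`, `get_checksum`, `get_mime`) are copied from the logged invocation; the `register: interface_stat` line is an assumption inferred from the later `variable 'interface_stat' from source: set_fact` entry.

```yaml
# Hedged sketch of tests/network/playbooks/tasks/get_interface_stat.yml,
# inferred from the logged module_args. The register name is assumed from
# the interface_stat variable the assert task reads later in this log.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

For `interface: test2` this renders the task header `TASK [Get stat for interface test2]` seen immediately below, and the resulting `stat` dict (e.g. `islnk: true`, `lnk_source: /sys/devices/virtual/net/<interface>`) matches the payload returned for `test1` above.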
12755 1727204086.12309: running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 12755 1727204086.12318: in run() - task 12b410aa-8751-72e9-1a19-000000000260 12755 1727204086.12330: variable 'ansible_search_path' from source: unknown 12755 1727204086.12338: variable 'ansible_search_path' from source: unknown 12755 1727204086.12385: calling self._execute() 12755 1727204086.12497: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.12511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.12528: variable 'omit' from source: magic vars 12755 1727204086.12943: variable 'ansible_distribution_major_version' from source: facts 12755 1727204086.12964: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204086.13081: variable 'omit' from source: magic vars 12755 1727204086.13085: variable 'omit' from source: magic vars 12755 1727204086.13170: variable 'interface' from source: task vars 12755 1727204086.13181: variable 'dhcp_interface2' from source: play vars 12755 1727204086.13264: variable 'dhcp_interface2' from source: play vars 12755 1727204086.13293: variable 'omit' from source: magic vars 12755 1727204086.13348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204086.13394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204086.13429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204086.13525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.13544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.13626: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 12755 1727204086.13629: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.13632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.13725: Set connection var ansible_connection to ssh 12755 1727204086.13743: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204086.13751: Set connection var ansible_shell_type to sh 12755 1727204086.13769: Set connection var ansible_timeout to 10 12755 1727204086.13780: Set connection var ansible_shell_executable to /bin/sh 12755 1727204086.13792: Set connection var ansible_pipelining to False 12755 1727204086.13822: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.13841: variable 'ansible_connection' from source: unknown 12755 1727204086.13844: variable 'ansible_module_compression' from source: unknown 12755 1727204086.13846: variable 'ansible_shell_type' from source: unknown 12755 1727204086.13899: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.13902: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.13904: variable 'ansible_pipelining' from source: unknown 12755 1727204086.13907: variable 'ansible_timeout' from source: unknown 12755 1727204086.13951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.14274: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204086.14701: variable 'omit' from source: magic vars 12755 1727204086.14704: starting attempt loop 12755 1727204086.14707: running the handler 12755 1727204086.14709: _low_level_execute_command(): starting 12755 1727204086.14712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 
1727204086.15460: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204086.15587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204086.15835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204086.15880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204086.17641: stdout chunk (state=3): >>>/root <<< 12755 1727204086.17841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204086.17844: stdout chunk (state=3): >>><<< 12755 1727204086.17847: stderr chunk (state=3): >>><<< 12755 1727204086.17868: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204086.17892: _low_level_execute_command(): starting 12755 1727204086.18088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920 `" && echo ansible-tmp-1727204086.178756-13462-30160059800920="` echo /root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920 `" ) && sleep 0' 12755 1727204086.19075: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204086.19102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12755 1727204086.19114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204086.19412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204086.19415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204086.19482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204086.21529: stdout chunk (state=3): >>>ansible-tmp-1727204086.178756-13462-30160059800920=/root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920 <<< 12755 1727204086.21649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204086.21810: stderr chunk (state=3): >>><<< 12755 1727204086.22003: stdout chunk (state=3): >>><<< 12755 1727204086.22007: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204086.178756-13462-30160059800920=/root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204086.22010: variable 'ansible_module_compression' from source: unknown 12755 1727204086.22177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12755 1727204086.22261: variable 'ansible_facts' from source: unknown 12755 1727204086.22596: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/AnsiballZ_stat.py 12755 1727204086.23025: Sending initial data 12755 1727204086.23037: Sent initial data (151 bytes) 12755 1727204086.24147: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204086.24296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204086.24299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204086.24302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204086.24305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204086.24309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204086.24363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204086.24481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204086.24548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204086.26322: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204086.26363: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204086.26612: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpiec_gpev /root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/AnsiballZ_stat.py <<< 12755 1727204086.26616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpiec_gpev" to remote "/root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/AnsiballZ_stat.py" <<< 12755 1727204086.28426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204086.28646: stderr chunk (state=3): >>><<< 12755 1727204086.28649: stdout chunk (state=3): >>><<< 12755 1727204086.28652: done transferring module to remote 12755 1727204086.28655: _low_level_execute_command(): starting 12755 1727204086.28657: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/ /root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/AnsiballZ_stat.py && sleep 0' 12755 1727204086.29376: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204086.29395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 
12755 1727204086.29409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204086.29474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204086.29508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204086.29550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204086.31790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204086.31794: stdout chunk (state=3): >>><<< 12755 1727204086.31797: stderr chunk (state=3): >>><<< 12755 1727204086.31799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204086.31802: _low_level_execute_command(): starting 12755 1727204086.31805: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/AnsiballZ_stat.py && sleep 0' 12755 1727204086.32570: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204086.32610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204086.32719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204086.32731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204086.32759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204086.32845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204086.50527: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": 
"/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35772, "dev": 23, "nlink": 1, "atime": 1727204084.2334564, "mtime": 1727204084.2334564, "ctime": 1727204084.2334564, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12755 1727204086.52399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204086.52403: stdout chunk (state=3): >>><<< 12755 1727204086.52405: stderr chunk (state=3): >>><<< 12755 1727204086.52408: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35772, "dev": 23, "nlink": 1, "atime": 1727204084.2334564, "mtime": 1727204084.2334564, "ctime": 1727204084.2334564, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed.
12755 1727204086.52599: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
12755 1727204086.52603: _low_level_execute_command(): starting
12755 1727204086.52606: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204086.178756-13462-30160059800920/ > /dev/null 2>&1 && sleep 0'
12755 1727204086.53422: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
12755 1727204086.53464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204086.53480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12755 1727204086.53512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12755 1727204086.53540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<<
12755 1727204086.53557: stderr chunk (state=3): >>>debug2: match not found <<<
12755 1727204086.53574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204086.53602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12755 1727204086.53628: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<<
12755 1727204086.53642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12755 1727204086.53658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204086.53674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12755 1727204086.53709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204086.53796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12755 1727204086.53883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204086.54000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204086.56142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204086.56146: stdout chunk (state=3): >>><<<
12755 1727204086.56148: stderr chunk (state=3): >>><<<
12755 1727204086.56165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12755 1727204086.56343: handler run complete
12755 1727204086.56383: attempt loop complete, returning result
12755 1727204086.56427: _execute() done
12755 1727204086.56437: dumping result to json
12755 1727204086.56461: done dumping result, returning
12755 1727204086.56630: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 [12b410aa-8751-72e9-1a19-000000000260]
12755 1727204086.56634: sending task result for task 12b410aa-8751-72e9-1a19-000000000260
12755 1727204086.56957: done sending task result for task 12b410aa-8751-72e9-1a19-000000000260
12755 1727204086.56961: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1727204084.2334564,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1727204084.2334564,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 35772,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/test2",
        "lnk_target": "../../devices/virtual/net/test2",
        "mode": "0777",
        "mtime": 1727204084.2334564,
        "nlink": 1,
        "path": "/sys/class/net/test2",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
12755 1727204086.57129: no more pending results, returning what we have
12755 1727204086.57134: results queue empty
12755 1727204086.57135: checking for any_errors_fatal
12755 1727204086.57196: done checking for any_errors_fatal
12755 1727204086.57198: checking for max_fail_percentage
12755 1727204086.57201: done checking for max_fail_percentage
12755 1727204086.57202: checking to see if all hosts have failed and the running result is not ok
12755 1727204086.57203: done checking to see if all hosts have failed
12755 1727204086.57204: getting the remaining hosts for this loop
12755 1727204086.57206: done getting the remaining hosts for this loop
12755 1727204086.57212: getting the next task for host managed-node1
12755 1727204086.57226: done getting next task for host managed-node1
12755 1727204086.57229: ^ task is: TASK: Assert that the interface is present - '{{ interface }}'
12755 1727204086.57233: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
12755 1727204086.57239: getting variables
12755 1727204086.57241: in VariableManager get_vars()
12755 1727204086.57605: Calling all_inventory to load vars for managed-node1
12755 1727204086.57609: Calling groups_inventory to load vars for managed-node1
12755 1727204086.57612: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204086.57632: Calling all_plugins_play to load vars for managed-node1
12755 1727204086.57637: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204086.57642: Calling groups_plugins_play to load vars for managed-node1
12755 1727204086.58332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204086.58978: done with get_vars()
12755 1727204086.58996: done getting variables
12755 1727204086.59075: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12755 1727204086.59392: variable 'interface' from source: task vars
12755 1727204086.59396: variable 'dhcp_interface2' from source: play vars
12755 1727204086.59630: variable 'dhcp_interface2' from source: play vars

TASK [Assert that the interface is present - 'test2'] **************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.482) 0:00:11.832 *****
12755 1727204086.59669: entering _queue_task() for managed-node1/assert
12755 1727204086.60251: worker is 1 (out of 1 available)
12755 1727204086.60264: exiting _queue_task() for managed-node1/assert
12755 1727204086.60276: done queuing things up, now waiting for results queue to drain
12755 1727204086.60277: waiting for pending results...
12755 1727204086.60837: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' 12755 1727204086.61014: in run() - task 12b410aa-8751-72e9-1a19-00000000001c 12755 1727204086.61038: variable 'ansible_search_path' from source: unknown 12755 1727204086.61048: variable 'ansible_search_path' from source: unknown 12755 1727204086.61126: calling self._execute() 12755 1727204086.61215: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.61237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.61394: variable 'omit' from source: magic vars 12755 1727204086.61976: variable 'ansible_distribution_major_version' from source: facts 12755 1727204086.62000: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204086.62024: variable 'omit' from source: magic vars 12755 1727204086.62088: variable 'omit' from source: magic vars 12755 1727204086.62223: variable 'interface' from source: task vars 12755 1727204086.62242: variable 'dhcp_interface2' from source: play vars 12755 1727204086.62325: variable 'dhcp_interface2' from source: play vars 12755 1727204086.62362: variable 'omit' from source: magic vars 12755 1727204086.62416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204086.62469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204086.62564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204086.62568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.62571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.62593: variable 'inventory_hostname' from source: host 
vars for 'managed-node1' 12755 1727204086.62603: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.62613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.62750: Set connection var ansible_connection to ssh 12755 1727204086.62765: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204086.62778: Set connection var ansible_shell_type to sh 12755 1727204086.62805: Set connection var ansible_timeout to 10 12755 1727204086.62819: Set connection var ansible_shell_executable to /bin/sh 12755 1727204086.62834: Set connection var ansible_pipelining to False 12755 1727204086.62891: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.62897: variable 'ansible_connection' from source: unknown 12755 1727204086.62899: variable 'ansible_module_compression' from source: unknown 12755 1727204086.62902: variable 'ansible_shell_type' from source: unknown 12755 1727204086.62904: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.62906: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.62966: variable 'ansible_pipelining' from source: unknown 12755 1727204086.62969: variable 'ansible_timeout' from source: unknown 12755 1727204086.62972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.63149: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204086.63176: variable 'omit' from source: magic vars 12755 1727204086.63187: starting attempt loop 12755 1727204086.63199: running the handler 12755 1727204086.63385: variable 'interface_stat' from source: set_fact 12755 1727204086.63426: Evaluated conditional 
(interface_stat.stat.exists): True
12755 1727204086.63496: handler run complete
12755 1727204086.63499: attempt loop complete, returning result
12755 1727204086.63501: _execute() done
12755 1727204086.63504: dumping result to json
12755 1727204086.63506: done dumping result, returning
12755 1727204086.63508: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' [12b410aa-8751-72e9-1a19-00000000001c]
12755 1727204086.63510: sending task result for task 12b410aa-8751-72e9-1a19-00000000001c
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
12755 1727204086.63699: no more pending results, returning what we have
12755 1727204086.63704: results queue empty
12755 1727204086.63705: checking for any_errors_fatal
12755 1727204086.63715: done checking for any_errors_fatal
12755 1727204086.63716: checking for max_fail_percentage
12755 1727204086.63718: done checking for max_fail_percentage
12755 1727204086.63719: checking to see if all hosts have failed and the running result is not ok
12755 1727204086.63720: done checking to see if all hosts have failed
12755 1727204086.63721: getting the remaining hosts for this loop
12755 1727204086.63723: done getting the remaining hosts for this loop
12755 1727204086.63729: getting the next task for host managed-node1
12755 1727204086.63739: done getting next task for host managed-node1
12755 1727204086.63742: ^ task is: TASK: Backup the /etc/resolv.conf for initscript
12755 1727204086.63745: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204086.63749: getting variables
12755 1727204086.63751: in VariableManager get_vars()
12755 1727204086.63819: Calling all_inventory to load vars for managed-node1
12755 1727204086.63823: Calling groups_inventory to load vars for managed-node1
12755 1727204086.63826: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204086.63841: Calling all_plugins_play to load vars for managed-node1
12755 1727204086.63845: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204086.63849: Calling groups_plugins_play to load vars for managed-node1
12755 1727204086.64508: done sending task result for task 12b410aa-8751-72e9-1a19-00000000001c
12755 1727204086.64512: WORKER PROCESS EXITING
12755 1727204086.64546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204086.64898: done with get_vars()
12755 1727204086.64914: done getting variables
12755 1727204086.64993: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Backup the /etc/resolv.conf for initscript] ******************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28
Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.053) 0:00:11.885 *****
12755 1727204086.65027: entering _queue_task() for managed-node1/command
12755 1727204086.65425: worker is 1 (out of 1 available)
12755 1727204086.65438: exiting _queue_task() for managed-node1/command
12755 1727204086.65450: done queuing things up, now waiting for results queue to drain
12755 1727204086.65452: waiting for pending results...
12755 1727204086.65626: running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript
12755 1727204086.65855: in run() - task 12b410aa-8751-72e9-1a19-00000000001d
12755 1727204086.65859: variable 'ansible_search_path' from source: unknown
12755 1727204086.65862: calling self._execute()
12755 1727204086.65928: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204086.65943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204086.65967: variable 'omit' from source: magic vars
12755 1727204086.66484: variable 'ansible_distribution_major_version' from source: facts
12755 1727204086.66509: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204086.66672: variable 'network_provider' from source: set_fact
12755 1727204086.66685: Evaluated conditional (network_provider == "initscripts"): False
12755 1727204086.66696: when evaluation is False, skipping this task
12755 1727204086.66704: _execute() done
12755 1727204086.66724: dumping result to json
12755 1727204086.66791: done dumping result, returning
12755 1727204086.66796: done running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript [12b410aa-8751-72e9-1a19-00000000001d]
12755 1727204086.66799: sending task result for task 12b410aa-8751-72e9-1a19-00000000001d
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
12755 1727204086.66996: no more pending results, returning what we have
12755 1727204086.67000: results queue empty
12755 1727204086.67002: checking for any_errors_fatal
12755 1727204086.67011: done checking for any_errors_fatal
12755 1727204086.67012: checking for max_fail_percentage
12755 1727204086.67014: done checking for max_fail_percentage
12755 1727204086.67015: checking to see if all hosts have failed and the running result is not ok
12755 1727204086.67016: done checking to see if all hosts have failed
12755 1727204086.67017: getting the remaining hosts for this loop
12755 1727204086.67019: done getting the remaining hosts for this loop
12755 1727204086.67024: getting the next task for host managed-node1
12755 1727204086.67031: done getting next task for host managed-node1
12755 1727204086.67034: ^ task is: TASK: TEST Add Bond with 2 ports
12755 1727204086.67038: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204086.67042: getting variables
12755 1727204086.67047: in VariableManager get_vars()
12755 1727204086.67114: Calling all_inventory to load vars for managed-node1
12755 1727204086.67118: Calling groups_inventory to load vars for managed-node1
12755 1727204086.67121: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204086.67138: Calling all_plugins_play to load vars for managed-node1
12755 1727204086.67142: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204086.67146: Calling groups_plugins_play to load vars for managed-node1
12755 1727204086.67554: done sending task result for task 12b410aa-8751-72e9-1a19-00000000001d
12755 1727204086.67558: WORKER PROCESS EXITING
12755 1727204086.67586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204086.67886: done with get_vars()
12755 1727204086.67901: done getting variables
12755 1727204086.68150: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [TEST Add Bond with 2 ports] **********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33
Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.031) 0:00:11.917 *****
12755 1727204086.68187: entering _queue_task() for managed-node1/debug
12755 1727204086.68453: worker is 1 (out of 1 available)
12755 1727204086.68467: exiting _queue_task() for managed-node1/debug
12755 1727204086.68480: done queuing things up, now waiting for results queue to drain
12755 1727204086.68482: waiting for pending results...
12755 1727204086.68763: running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports
12755 1727204086.68873: in run() - task 12b410aa-8751-72e9-1a19-00000000001e
12755 1727204086.68898: variable 'ansible_search_path' from source: unknown
12755 1727204086.68949: calling self._execute()
12755 1727204086.69059: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204086.69075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204086.69096: variable 'omit' from source: magic vars
12755 1727204086.69536: variable 'ansible_distribution_major_version' from source: facts
12755 1727204086.69558: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204086.69571: variable 'omit' from source: magic vars
12755 1727204086.69607: variable 'omit' from source: magic vars
12755 1727204086.69655: variable 'omit' from source: magic vars
12755 1727204086.69712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204086.69807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204086.69820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204086.69839: Loading ShellModule 'sh' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.69863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.69932: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204086.69941: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.69951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.70308: Set connection var ansible_connection to ssh 12755 1727204086.70314: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204086.70324: Set connection var ansible_shell_type to sh 12755 1727204086.70344: Set connection var ansible_timeout to 10 12755 1727204086.70362: Set connection var ansible_shell_executable to /bin/sh 12755 1727204086.70376: Set connection var ansible_pipelining to False 12755 1727204086.70409: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.70474: variable 'ansible_connection' from source: unknown 12755 1727204086.70477: variable 'ansible_module_compression' from source: unknown 12755 1727204086.70479: variable 'ansible_shell_type' from source: unknown 12755 1727204086.70481: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.70484: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.70486: variable 'ansible_pipelining' from source: unknown 12755 1727204086.70488: variable 'ansible_timeout' from source: unknown 12755 1727204086.70492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.70679: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204086.70706: variable 'omit' from source: magic vars 12755 1727204086.70717: starting attempt loop 12755 1727204086.70726: running the handler 12755 1727204086.70786: handler run complete 12755 1727204086.70853: attempt loop complete, returning result 12755 1727204086.70857: _execute() done 12755 1727204086.70859: dumping result to json 12755 1727204086.70862: done dumping result, returning 12755 1727204086.70865: done running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports [12b410aa-8751-72e9-1a19-00000000001e] 12755 1727204086.70867: sending task result for task 12b410aa-8751-72e9-1a19-00000000001e ok: [managed-node1] => {} MSG: ################################################## 12755 1727204086.71138: no more pending results, returning what we have 12755 1727204086.71142: results queue empty 12755 1727204086.71144: checking for any_errors_fatal 12755 1727204086.71151: done checking for any_errors_fatal 12755 1727204086.71152: checking for max_fail_percentage 12755 1727204086.71154: done checking for max_fail_percentage 12755 1727204086.71156: checking to see if all hosts have failed and the running result is not ok 12755 1727204086.71157: done checking to see if all hosts have failed 12755 1727204086.71158: getting the remaining hosts for this loop 12755 1727204086.71159: done getting the remaining hosts for this loop 12755 1727204086.71164: getting the next task for host managed-node1 12755 1727204086.71178: done getting next task for host managed-node1 12755 1727204086.71185: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12755 1727204086.71188: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204086.71210: getting variables 12755 1727204086.71212: in VariableManager get_vars() 12755 1727204086.71272: Calling all_inventory to load vars for managed-node1 12755 1727204086.71276: Calling groups_inventory to load vars for managed-node1 12755 1727204086.71279: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.71400: done sending task result for task 12b410aa-8751-72e9-1a19-00000000001e 12755 1727204086.71404: WORKER PROCESS EXITING 12755 1727204086.71428: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.71432: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.71437: Calling groups_plugins_play to load vars for managed-node1 12755 1727204086.72053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.72351: done with get_vars() 12755 1727204086.72362: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.042) 0:00:11.960 ***** 12755 1727204086.72474: entering _queue_task() for managed-node1/include_tasks 12755 1727204086.72817: worker is 1 (out of 1 available) 12755 1727204086.72829: exiting _queue_task() for managed-node1/include_tasks 12755 1727204086.72840: done queuing things up, now waiting for results queue to drain 12755 1727204086.72841: waiting for pending results... 
12755 1727204086.73034: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12755 1727204086.73196: in run() - task 12b410aa-8751-72e9-1a19-000000000026 12755 1727204086.73217: variable 'ansible_search_path' from source: unknown 12755 1727204086.73227: variable 'ansible_search_path' from source: unknown 12755 1727204086.73272: calling self._execute() 12755 1727204086.73375: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.73393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.73411: variable 'omit' from source: magic vars 12755 1727204086.74083: variable 'ansible_distribution_major_version' from source: facts 12755 1727204086.74087: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204086.74092: _execute() done 12755 1727204086.74094: dumping result to json 12755 1727204086.74097: done dumping result, returning 12755 1727204086.74100: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-72e9-1a19-000000000026] 12755 1727204086.74102: sending task result for task 12b410aa-8751-72e9-1a19-000000000026 12755 1727204086.74269: no more pending results, returning what we have 12755 1727204086.74276: in VariableManager get_vars() 12755 1727204086.74416: Calling all_inventory to load vars for managed-node1 12755 1727204086.74419: Calling groups_inventory to load vars for managed-node1 12755 1727204086.74422: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.74497: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.74502: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.74507: Calling groups_plugins_play to load vars for managed-node1 12755 1727204086.74770: done sending task result for task 12b410aa-8751-72e9-1a19-000000000026 12755 
1727204086.74774: WORKER PROCESS EXITING 12755 1727204086.74803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.75087: done with get_vars() 12755 1727204086.75097: variable 'ansible_search_path' from source: unknown 12755 1727204086.75099: variable 'ansible_search_path' from source: unknown 12755 1727204086.75143: we have included files to process 12755 1727204086.75144: generating all_blocks data 12755 1727204086.75146: done generating all_blocks data 12755 1727204086.75151: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204086.75153: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204086.75155: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204086.76126: done processing included file 12755 1727204086.76128: iterating over new_blocks loaded from include file 12755 1727204086.76130: in VariableManager get_vars() 12755 1727204086.76168: done with get_vars() 12755 1727204086.76170: filtering new block on tags 12755 1727204086.76198: done filtering new block on tags 12755 1727204086.76201: in VariableManager get_vars() 12755 1727204086.76237: done with get_vars() 12755 1727204086.76238: filtering new block on tags 12755 1727204086.76262: done filtering new block on tags 12755 1727204086.76265: in VariableManager get_vars() 12755 1727204086.76307: done with get_vars() 12755 1727204086.76309: filtering new block on tags 12755 1727204086.76333: done filtering new block on tags 12755 1727204086.76337: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 12755 1727204086.76342: extending task lists for all hosts 
with included blocks 12755 1727204086.77468: done extending task lists 12755 1727204086.77470: done processing included files 12755 1727204086.77471: results queue empty 12755 1727204086.77472: checking for any_errors_fatal 12755 1727204086.77475: done checking for any_errors_fatal 12755 1727204086.77476: checking for max_fail_percentage 12755 1727204086.77481: done checking for max_fail_percentage 12755 1727204086.77482: checking to see if all hosts have failed and the running result is not ok 12755 1727204086.77483: done checking to see if all hosts have failed 12755 1727204086.77484: getting the remaining hosts for this loop 12755 1727204086.77485: done getting the remaining hosts for this loop 12755 1727204086.77488: getting the next task for host managed-node1 12755 1727204086.77493: done getting next task for host managed-node1 12755 1727204086.77497: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12755 1727204086.77500: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204086.77511: getting variables 12755 1727204086.77512: in VariableManager get_vars() 12755 1727204086.77538: Calling all_inventory to load vars for managed-node1 12755 1727204086.77541: Calling groups_inventory to load vars for managed-node1 12755 1727204086.77543: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.77550: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.77554: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.77558: Calling groups_plugins_play to load vars for managed-node1 12755 1727204086.77771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.78050: done with get_vars() 12755 1727204086.78061: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.056) 0:00:12.017 ***** 12755 1727204086.78147: entering _queue_task() for managed-node1/setup 12755 1727204086.78548: worker is 1 (out of 1 available) 12755 1727204086.78560: exiting _queue_task() for managed-node1/setup 12755 1727204086.78575: done queuing things up, now waiting for results queue to drain 12755 1727204086.78577: waiting for pending results... 
12755 1727204086.78725: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12755 1727204086.78906: in run() - task 12b410aa-8751-72e9-1a19-00000000027e 12755 1727204086.78931: variable 'ansible_search_path' from source: unknown 12755 1727204086.78940: variable 'ansible_search_path' from source: unknown 12755 1727204086.78991: calling self._execute() 12755 1727204086.79100: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.79114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.79197: variable 'omit' from source: magic vars 12755 1727204086.79598: variable 'ansible_distribution_major_version' from source: facts 12755 1727204086.79619: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204086.79932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204086.82644: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204086.82745: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204086.82803: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204086.82849: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204086.82968: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204086.82996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204086.83043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204086.83085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204086.83152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204086.83177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204086.83260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204086.83299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204086.83347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204086.83406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204086.83492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204086.83654: variable '__network_required_facts' from source: role 
'' defaults 12755 1727204086.83674: variable 'ansible_facts' from source: unknown 12755 1727204086.83798: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12755 1727204086.83880: when evaluation is False, skipping this task 12755 1727204086.83884: _execute() done 12755 1727204086.83887: dumping result to json 12755 1727204086.83892: done dumping result, returning 12755 1727204086.83895: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-72e9-1a19-00000000027e] 12755 1727204086.83897: sending task result for task 12b410aa-8751-72e9-1a19-00000000027e 12755 1727204086.83970: done sending task result for task 12b410aa-8751-72e9-1a19-00000000027e 12755 1727204086.83973: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204086.84025: no more pending results, returning what we have 12755 1727204086.84030: results queue empty 12755 1727204086.84031: checking for any_errors_fatal 12755 1727204086.84033: done checking for any_errors_fatal 12755 1727204086.84034: checking for max_fail_percentage 12755 1727204086.84035: done checking for max_fail_percentage 12755 1727204086.84036: checking to see if all hosts have failed and the running result is not ok 12755 1727204086.84038: done checking to see if all hosts have failed 12755 1727204086.84038: getting the remaining hosts for this loop 12755 1727204086.84040: done getting the remaining hosts for this loop 12755 1727204086.84045: getting the next task for host managed-node1 12755 1727204086.84056: done getting next task for host managed-node1 12755 1727204086.84060: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204086.84065: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204086.84081: getting variables 12755 1727204086.84083: in VariableManager get_vars() 12755 1727204086.84150: Calling all_inventory to load vars for managed-node1 12755 1727204086.84154: Calling groups_inventory to load vars for managed-node1 12755 1727204086.84157: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.84170: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.84174: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.84179: Calling groups_plugins_play to load vars for managed-node1 12755 1727204086.84708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.85026: done with get_vars() 12755 1727204086.85040: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.070) 0:00:12.087 ***** 12755 1727204086.85171: entering _queue_task() for managed-node1/stat 12755 1727204086.85530: worker is 1 (out of 1 
available) 12755 1727204086.85541: exiting _queue_task() for managed-node1/stat 12755 1727204086.85554: done queuing things up, now waiting for results queue to drain 12755 1727204086.85556: waiting for pending results... 12755 1727204086.85840: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204086.86046: in run() - task 12b410aa-8751-72e9-1a19-000000000280 12755 1727204086.86051: variable 'ansible_search_path' from source: unknown 12755 1727204086.86053: variable 'ansible_search_path' from source: unknown 12755 1727204086.86057: calling self._execute() 12755 1727204086.86107: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.86122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.86141: variable 'omit' from source: magic vars 12755 1727204086.86590: variable 'ansible_distribution_major_version' from source: facts 12755 1727204086.86616: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204086.86839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204086.87228: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204086.87298: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204086.87342: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204086.87394: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204086.87504: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204086.87544: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204086.87595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204086.87680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204086.87753: variable '__network_is_ostree' from source: set_fact 12755 1727204086.87766: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204086.87774: when evaluation is False, skipping this task 12755 1727204086.87787: _execute() done 12755 1727204086.87801: dumping result to json 12755 1727204086.87810: done dumping result, returning 12755 1727204086.87824: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-72e9-1a19-000000000280] 12755 1727204086.87836: sending task result for task 12b410aa-8751-72e9-1a19-000000000280 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204086.88219: no more pending results, returning what we have 12755 1727204086.88223: results queue empty 12755 1727204086.88225: checking for any_errors_fatal 12755 1727204086.88231: done checking for any_errors_fatal 12755 1727204086.88232: checking for max_fail_percentage 12755 1727204086.88233: done checking for max_fail_percentage 12755 1727204086.88234: checking to see if all hosts have failed and the running result is not ok 12755 1727204086.88236: done checking to see if all hosts have failed 12755 1727204086.88237: getting the remaining hosts for this loop 12755 
1727204086.88238: done getting the remaining hosts for this loop 12755 1727204086.88242: getting the next task for host managed-node1 12755 1727204086.88248: done getting next task for host managed-node1 12755 1727204086.88252: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204086.88257: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204086.88271: getting variables 12755 1727204086.88273: in VariableManager get_vars() 12755 1727204086.88327: Calling all_inventory to load vars for managed-node1 12755 1727204086.88330: Calling groups_inventory to load vars for managed-node1 12755 1727204086.88334: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.88343: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.88347: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.88351: Calling groups_plugins_play to load vars for managed-node1 12755 1727204086.88607: done sending task result for task 12b410aa-8751-72e9-1a19-000000000280 12755 1727204086.88615: WORKER PROCESS EXITING 12755 1727204086.88641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.88956: done with get_vars() 12755 1727204086.88967: done getting variables 12755 1727204086.89031: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.039) 0:00:12.126 ***** 12755 1727204086.89076: entering _queue_task() for managed-node1/set_fact 12755 1727204086.89332: worker is 1 (out of 1 available) 12755 1727204086.89346: exiting _queue_task() for managed-node1/set_fact 12755 1727204086.89359: done queuing things up, now waiting for results queue to drain 12755 1727204086.89361: waiting for pending results... 
12755 1727204086.89649: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204086.89929: in run() - task 12b410aa-8751-72e9-1a19-000000000281 12755 1727204086.89933: variable 'ansible_search_path' from source: unknown 12755 1727204086.89937: variable 'ansible_search_path' from source: unknown 12755 1727204086.89940: calling self._execute() 12755 1727204086.90016: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.90038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.90057: variable 'omit' from source: magic vars 12755 1727204086.90507: variable 'ansible_distribution_major_version' from source: facts 12755 1727204086.90527: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204086.90749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204086.91073: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204086.91142: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204086.91188: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204086.91245: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204086.91355: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204086.91452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204086.91459: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204086.91479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204086.91595: variable '__network_is_ostree' from source: set_fact 12755 1727204086.91609: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204086.91618: when evaluation is False, skipping this task 12755 1727204086.91626: _execute() done 12755 1727204086.91635: dumping result to json 12755 1727204086.91644: done dumping result, returning 12755 1727204086.91657: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-72e9-1a19-000000000281] 12755 1727204086.91675: sending task result for task 12b410aa-8751-72e9-1a19-000000000281 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204086.91840: no more pending results, returning what we have 12755 1727204086.91844: results queue empty 12755 1727204086.91846: checking for any_errors_fatal 12755 1727204086.91851: done checking for any_errors_fatal 12755 1727204086.91852: checking for max_fail_percentage 12755 1727204086.91854: done checking for max_fail_percentage 12755 1727204086.91855: checking to see if all hosts have failed and the running result is not ok 12755 1727204086.91857: done checking to see if all hosts have failed 12755 1727204086.91858: getting the remaining hosts for this loop 12755 1727204086.91860: done getting the remaining hosts for this loop 12755 1727204086.91865: getting the next task for host managed-node1 12755 1727204086.91876: done getting next task for host managed-node1 12755 
1727204086.91880: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204086.91885: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204086.91909: getting variables 12755 1727204086.91911: in VariableManager get_vars() 12755 1727204086.91974: Calling all_inventory to load vars for managed-node1 12755 1727204086.91978: Calling groups_inventory to load vars for managed-node1 12755 1727204086.91981: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204086.92199: Calling all_plugins_play to load vars for managed-node1 12755 1727204086.92203: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204086.92209: Calling groups_plugins_play to load vars for managed-node1 12755 1727204086.92460: done sending task result for task 12b410aa-8751-72e9-1a19-000000000281 12755 1727204086.92464: WORKER PROCESS EXITING 12755 1727204086.92492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204086.92810: done with get_vars() 12755 1727204086.92821: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.038) 0:00:12.165 ***** 12755 1727204086.92936: entering _queue_task() for managed-node1/service_facts 12755 1727204086.92938: Creating lock for service_facts 12755 1727204086.93323: worker is 1 (out of 1 available) 12755 1727204086.93335: exiting _queue_task() for managed-node1/service_facts 12755 1727204086.93347: done queuing things up, now waiting for results queue to drain 12755 1727204086.93348: waiting for pending results... 12755 1727204086.93523: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204086.93717: in run() - task 12b410aa-8751-72e9-1a19-000000000283 12755 1727204086.93739: variable 'ansible_search_path' from source: unknown 12755 1727204086.93754: variable 'ansible_search_path' from source: unknown 12755 1727204086.93806: calling self._execute() 12755 1727204086.93914: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.93932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.93954: variable 'omit' from source: magic vars 12755 1727204086.94460: variable 'ansible_distribution_major_version' from source: facts 12755 1727204086.94480: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204086.94496: variable 'omit' from source: magic vars 12755 1727204086.94604: variable 'omit' from source: magic vars 12755 1727204086.94674: variable 'omit' from source: magic vars 12755 1727204086.94711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204086.94781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204086.94869: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204086.94873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.94876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204086.94907: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204086.94919: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.94937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.95081: Set connection var ansible_connection to ssh 12755 1727204086.95119: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204086.95128: Set connection var ansible_shell_type to sh 12755 1727204086.95151: Set connection var ansible_timeout to 10 12755 1727204086.95165: Set connection var ansible_shell_executable to /bin/sh 12755 1727204086.95220: Set connection var ansible_pipelining to False 12755 1727204086.95223: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.95226: variable 'ansible_connection' from source: unknown 12755 1727204086.95231: variable 'ansible_module_compression' from source: unknown 12755 1727204086.95241: variable 'ansible_shell_type' from source: unknown 12755 1727204086.95249: variable 'ansible_shell_executable' from source: unknown 12755 1727204086.95257: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204086.95265: variable 'ansible_pipelining' from source: unknown 12755 1727204086.95271: variable 'ansible_timeout' from source: unknown 12755 1727204086.95328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204086.95527: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204086.95555: variable 'omit' from source: magic vars 12755 1727204086.95566: starting attempt loop 12755 1727204086.95575: running the handler 12755 1727204086.95599: _low_level_execute_command(): starting 12755 1727204086.95614: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204086.96434: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204086.96541: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204086.96587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204086.96733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204086.98459: stdout chunk (state=3): >>>/root <<< 12755 1727204086.98703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204086.98707: stdout chunk (state=3): >>><<< 12755 
1727204086.98710: stderr chunk (state=3): >>><<< 12755 1727204086.98732: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204086.98760: _low_level_execute_command(): starting 12755 1727204086.98801: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779 `" && echo ansible-tmp-1727204086.9873996-13585-177299925471779="` echo /root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779 `" ) && sleep 0' 12755 1727204087.00061: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204087.00126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204087.00205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204087.00421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204087.00509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204087.02749: stdout chunk (state=3): >>>ansible-tmp-1727204086.9873996-13585-177299925471779=/root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779 <<< 12755 1727204087.02956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204087.02961: stderr chunk (state=3): >>><<< 12755 1727204087.03021: stdout chunk (state=3): >>><<< 12755 1727204087.03041: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204086.9873996-13585-177299925471779=/root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204087.03111: variable 'ansible_module_compression' from source: unknown 12755 1727204087.03392: ANSIBALLZ: Using lock for service_facts 12755 1727204087.03398: ANSIBALLZ: Acquiring lock 12755 1727204087.03401: ANSIBALLZ: Lock acquired: 139630690854928 12755 1727204087.03403: ANSIBALLZ: Creating module 12755 1727204087.39304: ANSIBALLZ: Writing module into payload 12755 1727204087.39631: ANSIBALLZ: Writing module 12755 1727204087.39655: ANSIBALLZ: Renaming module 12755 1727204087.39662: ANSIBALLZ: Done creating module 12755 1727204087.39681: variable 'ansible_facts' from source: unknown 12755 1727204087.39761: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/AnsiballZ_service_facts.py 12755 1727204087.40146: Sending initial data 12755 1727204087.40150: Sent initial data (162 bytes) 12755 1727204087.41821: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204087.41867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204087.41913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204087.43653: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204087.43813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204087.43861: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpjkmz93r3 /root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/AnsiballZ_service_facts.py <<< 12755 1727204087.43865: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/AnsiballZ_service_facts.py" <<< 12755 1727204087.43929: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpjkmz93r3" to remote "/root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/AnsiballZ_service_facts.py" <<< 12755 1727204087.46445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204087.46456: stdout chunk (state=3): >>><<< 12755 1727204087.46510: stderr chunk (state=3): >>><<< 12755 1727204087.46548: done transferring module to remote 12755 1727204087.46740: _low_level_execute_command(): starting 12755 1727204087.46744: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/ /root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/AnsiballZ_service_facts.py && sleep 0' 12755 1727204087.47969: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204087.48107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204087.48208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204087.48232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204087.48352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204087.50489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204087.50495: stdout chunk (state=3): >>><<< 12755 1727204087.50498: stderr chunk (state=3): >>><<< 12755 1727204087.50696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204087.50702: _low_level_execute_command(): starting 12755 1727204087.50706: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/AnsiballZ_service_facts.py && sleep 0' 12755 1727204087.52006: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204087.52025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204087.52107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204087.52192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204087.52420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204087.52457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204087.52518: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 12755 1727204089.58131: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": 
{"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": 
{"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": 
"mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": 
"polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12755 1727204089.60297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204089.60301: stdout chunk (state=3): >>><<< 12755 1727204089.60303: stderr chunk (state=3): >>><<< 12755 1727204089.60307: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204089.63597: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204089.63709: _low_level_execute_command(): starting 12755 1727204089.63713: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204086.9873996-13585-177299925471779/ > /dev/null 2>&1 && sleep 0' 12755 1727204089.65412: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204089.65438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204089.65465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204089.65483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204089.65568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204089.67622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204089.67661: stderr chunk (state=3): >>><<< 12755 1727204089.67672: stdout chunk (state=3): >>><<< 12755 1727204089.67696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204089.67729: handler run complete 12755 1727204089.68495: variable 'ansible_facts' from source: unknown 12755 1727204089.68978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204089.71408: variable 'ansible_facts' from source: unknown 12755 1727204089.74741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204089.75497: attempt loop complete, returning result 12755 1727204089.75540: _execute() done 12755 1727204089.75544: dumping result to json 12755 1727204089.75743: done dumping result, returning 12755 1727204089.75756: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-72e9-1a19-000000000283] 12755 1727204089.75759: sending task result for task 12b410aa-8751-72e9-1a19-000000000283 12755 1727204089.77642: done sending task result for task 12b410aa-8751-72e9-1a19-000000000283 12755 1727204089.77645: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204089.77762: no more pending results, returning what we have 12755 1727204089.77765: results queue empty 12755 1727204089.77766: checking for any_errors_fatal 12755 1727204089.77770: done checking for any_errors_fatal 12755 1727204089.77771: checking for max_fail_percentage 12755 1727204089.77772: done checking for 
max_fail_percentage 12755 1727204089.77773: checking to see if all hosts have failed and the running result is not ok 12755 1727204089.77774: done checking to see if all hosts have failed 12755 1727204089.77776: getting the remaining hosts for this loop 12755 1727204089.77778: done getting the remaining hosts for this loop 12755 1727204089.77782: getting the next task for host managed-node1 12755 1727204089.77792: done getting next task for host managed-node1 12755 1727204089.77796: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12755 1727204089.77802: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204089.77902: getting variables 12755 1727204089.77905: in VariableManager get_vars() 12755 1727204089.78038: Calling all_inventory to load vars for managed-node1 12755 1727204089.78042: Calling groups_inventory to load vars for managed-node1 12755 1727204089.78045: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204089.78056: Calling all_plugins_play to load vars for managed-node1 12755 1727204089.78059: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204089.78064: Calling groups_plugins_play to load vars for managed-node1 12755 1727204089.79341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204089.80772: done with get_vars() 12755 1727204089.80791: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:49 -0400 (0:00:02.879) 0:00:15.044 ***** 12755 1727204089.80918: entering _queue_task() for managed-node1/package_facts 12755 1727204089.80921: Creating lock for package_facts 12755 1727204089.81237: worker is 1 (out of 1 available) 12755 1727204089.81252: exiting _queue_task() for managed-node1/package_facts 12755 1727204089.81266: done queuing things up, now waiting for results queue to drain 12755 1727204089.81268: waiting for pending results... 
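The `ansible_facts.services` payload dumped earlier in this log is a flat mapping from unit name to a small record with `name`, `state`, `status`, and `source` fields. A minimal sketch of filtering such a payload in Python, using hypothetical entries that mirror the fields shown in the log (not the full captured data):

```python
# Hypothetical excerpt shaped like the service_facts records above:
# each unit name maps to {"name", "state", "status", "source"}.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "rpcbind.service": {"name": "rpcbind.service", "state": "stopped",
                        "status": "disabled", "source": "systemd"},
    "systemd-journald.service": {"name": "systemd-journald.service",
                                 "state": "running", "status": "static",
                                 "source": "systemd"},
}

def running_services(facts: dict) -> list[str]:
    """Return the names of units whose reported state is 'running', sorted."""
    return sorted(name for name, rec in facts.items()
                  if rec["state"] == "running")

print(running_services(services))
```

In a playbook the same check is typically done in Jinja2 against `ansible_facts.services` after a `service_facts` task; the sketch above only illustrates the record shape visible in this log.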
12755 1727204089.81576: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12755 1727204089.81707: in run() - task 12b410aa-8751-72e9-1a19-000000000284 12755 1727204089.81765: variable 'ansible_search_path' from source: unknown 12755 1727204089.81769: variable 'ansible_search_path' from source: unknown 12755 1727204089.81817: calling self._execute() 12755 1727204089.81944: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204089.81951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204089.81995: variable 'omit' from source: magic vars 12755 1727204089.82394: variable 'ansible_distribution_major_version' from source: facts 12755 1727204089.82498: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204089.82503: variable 'omit' from source: magic vars 12755 1727204089.82513: variable 'omit' from source: magic vars 12755 1727204089.82565: variable 'omit' from source: magic vars 12755 1727204089.82609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204089.82651: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204089.82673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204089.82696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204089.82710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204089.82754: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204089.82758: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204089.82761: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12755 1727204089.82893: Set connection var ansible_connection to ssh 12755 1727204089.82900: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204089.82903: Set connection var ansible_shell_type to sh 12755 1727204089.82921: Set connection var ansible_timeout to 10 12755 1727204089.82930: Set connection var ansible_shell_executable to /bin/sh 12755 1727204089.82937: Set connection var ansible_pipelining to False 12755 1727204089.82964: variable 'ansible_shell_executable' from source: unknown 12755 1727204089.82972: variable 'ansible_connection' from source: unknown 12755 1727204089.82975: variable 'ansible_module_compression' from source: unknown 12755 1727204089.82978: variable 'ansible_shell_type' from source: unknown 12755 1727204089.82980: variable 'ansible_shell_executable' from source: unknown 12755 1727204089.82983: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204089.82986: variable 'ansible_pipelining' from source: unknown 12755 1727204089.83017: variable 'ansible_timeout' from source: unknown 12755 1727204089.83020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204089.83322: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204089.83327: variable 'omit' from source: magic vars 12755 1727204089.83330: starting attempt loop 12755 1727204089.83332: running the handler 12755 1727204089.83335: _low_level_execute_command(): starting 12755 1727204089.83337: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204089.84671: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204089.85117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204089.85211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204089.85284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204089.87073: stdout chunk (state=3): >>>/root <<< 12755 1727204089.87186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204089.87243: stderr chunk (state=3): >>><<< 12755 1727204089.87246: stdout chunk (state=3): >>><<< 12755 1727204089.87259: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204089.87339: _low_level_execute_command(): starting 12755 1727204089.87344: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288 `" && echo ansible-tmp-1727204089.8726425-13735-114662930589288="` echo /root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288 `" ) && sleep 0' 12755 1727204089.87871: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204089.88007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204089.90079: stdout chunk (state=3): >>>ansible-tmp-1727204089.8726425-13735-114662930589288=/root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288 <<< 12755 1727204089.90234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204089.90238: stdout chunk (state=3): >>><<< 12755 1727204089.90244: stderr chunk (state=3): >>><<< 12755 1727204089.90261: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204089.8726425-13735-114662930589288=/root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 12755 1727204089.90305: variable 'ansible_module_compression' from source: unknown 12755 1727204089.90353: ANSIBALLZ: Using lock for package_facts 12755 1727204089.90367: ANSIBALLZ: Acquiring lock 12755 1727204089.90370: ANSIBALLZ: Lock acquired: 139630692300368 12755 1727204089.90372: ANSIBALLZ: Creating module 12755 1727204090.21299: ANSIBALLZ: Writing module into payload 12755 1727204090.21446: ANSIBALLZ: Writing module 12755 1727204090.21495: ANSIBALLZ: Renaming module 12755 1727204090.21512: ANSIBALLZ: Done creating module 12755 1727204090.21564: variable 'ansible_facts' from source: unknown 12755 1727204090.21797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/AnsiballZ_package_facts.py 12755 1727204090.22022: Sending initial data 12755 1727204090.22025: Sent initial data (162 bytes) 12755 1727204090.22695: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204090.22862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204090.23064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
12755 1727204090.23097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204090.23118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204090.23429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204090.25171: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12755 1727204090.25192: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 12755 1727204090.25212: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204090.25370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204090.25374: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmptcqa0hq9 /root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/AnsiballZ_package_facts.py <<< 12755 1727204090.25376: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/AnsiballZ_package_facts.py" <<< 12755 1727204090.25507: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmptcqa0hq9" to remote "/root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/AnsiballZ_package_facts.py" <<< 12755 1727204090.31120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204090.31124: stderr chunk (state=3): >>><<< 12755 1727204090.31127: stdout chunk (state=3): >>><<< 12755 1727204090.31129: done transferring module to remote 12755 1727204090.31132: _low_level_execute_command(): starting 12755 1727204090.31134: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/ /root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/AnsiballZ_package_facts.py && sleep 0' 12755 1727204090.32409: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204090.32519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204090.32530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204090.32647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204090.32679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204090.34787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204090.34797: stdout chunk (state=3): >>><<< 12755 1727204090.34806: stderr chunk (state=3): >>><<< 12755 1727204090.34924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204090.34929: _low_level_execute_command(): starting 12755 1727204090.34932: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/AnsiballZ_package_facts.py && sleep 0' 12755 1727204090.35414: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204090.35462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204090.35470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204090.35474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204090.35479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204090.35485: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204090.35487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204090.35503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204090.35512: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204090.35571: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12755 1727204090.35579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204090.35582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204090.35585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 
1727204090.35587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204090.35594: stderr chunk (state=3): >>>debug2: match found <<< 12755 1727204090.35596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204090.35648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204090.35667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204090.35679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204090.35788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204090.99686: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 12755 1727204090.99709: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": 
"4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 12755 1727204090.99738: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", 
"version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", 
"version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 12755 1727204090.99760: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 12755 1727204090.99786: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": 
"libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": 
"libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 12755 1727204090.99815: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgusb": [{"name": "libgusb",<<< 12755 1727204090.99828: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 12755 1727204090.99861: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", 
"version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": n<<< 12755 1727204090.99869: stdout chunk (state=3): >>>ull, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": 
"curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 12755 1727204090.99875: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 12755 1727204090.99898: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": 
[{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 12755 1727204090.99911: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": 
"kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 12755 1727204090.99929: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": 
[{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "so<<< 12755 1727204090.99956: stdout chunk (state=3): >>>urce": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifa<<< 12755 1727204090.99964: stdout chunk (state=3): >>>ces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": 
"python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": 
"openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12755 1727204091.01906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204091.01973: stderr chunk (state=3): >>><<< 12755 1727204091.01977: stdout chunk (state=3): >>><<< 12755 1727204091.02027: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": 
[{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", 
"version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", 
"release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": 
"perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", 
"release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": 
"502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": 
"8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204091.04410: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204091.04436: _low_level_execute_command(): starting 12755 1727204091.04439: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204089.8726425-13735-114662930589288/ > /dev/null 2>&1 && sleep 0' 12755 1727204091.04953: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204091.04956: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204091.04959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204091.04962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204091.05014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204091.05020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204091.05033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204091.05099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204091.07166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204091.07169: stdout chunk (state=3): >>><<< 12755 1727204091.07172: stderr chunk (state=3): >>><<< 12755 1727204091.07187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204091.07204: handler run complete 12755 1727204091.08331: variable 'ansible_facts' from source: unknown 12755 1727204091.08759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204091.10763: variable 'ansible_facts' from source: unknown 12755 1727204091.15547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204091.16594: attempt loop complete, returning result 12755 1727204091.16621: _execute() done 12755 1727204091.16626: dumping result to json 12755 1727204091.16909: done dumping result, returning 12755 1727204091.16921: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-72e9-1a19-000000000284] 12755 1727204091.16924: sending task result for task 12b410aa-8751-72e9-1a19-000000000284 12755 1727204091.19557: done sending task result for task 12b410aa-8751-72e9-1a19-000000000284 12755 1727204091.19563: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204091.19644: no more pending 
results, returning what we have 12755 1727204091.19647: results queue empty 12755 1727204091.19648: checking for any_errors_fatal 12755 1727204091.19653: done checking for any_errors_fatal 12755 1727204091.19654: checking for max_fail_percentage 12755 1727204091.19655: done checking for max_fail_percentage 12755 1727204091.19656: checking to see if all hosts have failed and the running result is not ok 12755 1727204091.19656: done checking to see if all hosts have failed 12755 1727204091.19657: getting the remaining hosts for this loop 12755 1727204091.19658: done getting the remaining hosts for this loop 12755 1727204091.19661: getting the next task for host managed-node1 12755 1727204091.19666: done getting next task for host managed-node1 12755 1727204091.19668: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204091.19671: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204091.19680: getting variables 12755 1727204091.19681: in VariableManager get_vars() 12755 1727204091.19720: Calling all_inventory to load vars for managed-node1 12755 1727204091.19723: Calling groups_inventory to load vars for managed-node1 12755 1727204091.19725: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204091.19733: Calling all_plugins_play to load vars for managed-node1 12755 1727204091.19735: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204091.19737: Calling groups_plugins_play to load vars for managed-node1 12755 1727204091.21029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204091.22675: done with get_vars() 12755 1727204091.22700: done getting variables 12755 1727204091.22777: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:51 -0400 (0:00:01.418) 0:00:16.463 ***** 12755 1727204091.22807: entering _queue_task() for managed-node1/debug 12755 1727204091.23062: worker is 1 (out of 1 available) 12755 1727204091.23078: exiting _queue_task() for managed-node1/debug 12755 1727204091.23092: done queuing things up, now waiting for results queue to drain 12755 1727204091.23094: waiting for pending results... 
12755 1727204091.23284: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204091.23387: in run() - task 12b410aa-8751-72e9-1a19-000000000027 12755 1727204091.23403: variable 'ansible_search_path' from source: unknown 12755 1727204091.23407: variable 'ansible_search_path' from source: unknown 12755 1727204091.23447: calling self._execute() 12755 1727204091.23526: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204091.23536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.23553: variable 'omit' from source: magic vars 12755 1727204091.23870: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.23887: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204091.23892: variable 'omit' from source: magic vars 12755 1727204091.23938: variable 'omit' from source: magic vars 12755 1727204091.24025: variable 'network_provider' from source: set_fact 12755 1727204091.24040: variable 'omit' from source: magic vars 12755 1727204091.24076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204091.24111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204091.24133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204091.24149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204091.24161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204091.24191: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204091.24194: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 
1727204091.24197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.24284: Set connection var ansible_connection to ssh 12755 1727204091.24292: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204091.24295: Set connection var ansible_shell_type to sh 12755 1727204091.24306: Set connection var ansible_timeout to 10 12755 1727204091.24317: Set connection var ansible_shell_executable to /bin/sh 12755 1727204091.24328: Set connection var ansible_pipelining to False 12755 1727204091.24344: variable 'ansible_shell_executable' from source: unknown 12755 1727204091.24349: variable 'ansible_connection' from source: unknown 12755 1727204091.24351: variable 'ansible_module_compression' from source: unknown 12755 1727204091.24356: variable 'ansible_shell_type' from source: unknown 12755 1727204091.24358: variable 'ansible_shell_executable' from source: unknown 12755 1727204091.24363: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204091.24368: variable 'ansible_pipelining' from source: unknown 12755 1727204091.24372: variable 'ansible_timeout' from source: unknown 12755 1727204091.24377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.24500: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204091.24510: variable 'omit' from source: magic vars 12755 1727204091.24515: starting attempt loop 12755 1727204091.24522: running the handler 12755 1727204091.24566: handler run complete 12755 1727204091.24579: attempt loop complete, returning result 12755 1727204091.24583: _execute() done 12755 1727204091.24586: dumping result to json 12755 1727204091.24592: done dumping result, returning 
12755 1727204091.24601: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-72e9-1a19-000000000027] 12755 1727204091.24605: sending task result for task 12b410aa-8751-72e9-1a19-000000000027 12755 1727204091.24699: done sending task result for task 12b410aa-8751-72e9-1a19-000000000027 12755 1727204091.24702: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 12755 1727204091.24765: no more pending results, returning what we have 12755 1727204091.24768: results queue empty 12755 1727204091.24769: checking for any_errors_fatal 12755 1727204091.24781: done checking for any_errors_fatal 12755 1727204091.24782: checking for max_fail_percentage 12755 1727204091.24784: done checking for max_fail_percentage 12755 1727204091.24785: checking to see if all hosts have failed and the running result is not ok 12755 1727204091.24786: done checking to see if all hosts have failed 12755 1727204091.24787: getting the remaining hosts for this loop 12755 1727204091.24788: done getting the remaining hosts for this loop 12755 1727204091.24803: getting the next task for host managed-node1 12755 1727204091.24811: done getting next task for host managed-node1 12755 1727204091.24815: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12755 1727204091.24818: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12755 1727204091.24830: getting variables 12755 1727204091.24832: in VariableManager get_vars() 12755 1727204091.24883: Calling all_inventory to load vars for managed-node1 12755 1727204091.24885: Calling groups_inventory to load vars for managed-node1 12755 1727204091.24888: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204091.24918: Calling all_plugins_play to load vars for managed-node1 12755 1727204091.24928: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204091.24933: Calling groups_plugins_play to load vars for managed-node1 12755 1727204091.26857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204091.29048: done with get_vars() 12755 1727204091.29101: done getting variables 12755 1727204091.29221: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:51 -0400 (0:00:00.064) 0:00:16.528 ***** 12755 1727204091.29264: entering _queue_task() for managed-node1/fail 12755 1727204091.29266: Creating lock for fail 12755 1727204091.29580: worker is 1 (out of 1 available) 12755 1727204091.29597: exiting _queue_task() for managed-node1/fail 12755 1727204091.29611: done queuing things up, now waiting for results queue to drain 12755 1727204091.29612: waiting for pending results... 
12755 1727204091.29979: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12755 1727204091.30018: in run() - task 12b410aa-8751-72e9-1a19-000000000028 12755 1727204091.30030: variable 'ansible_search_path' from source: unknown 12755 1727204091.30034: variable 'ansible_search_path' from source: unknown 12755 1727204091.30068: calling self._execute() 12755 1727204091.30167: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204091.30173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.30184: variable 'omit' from source: magic vars 12755 1727204091.30565: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.30577: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204091.30685: variable 'network_state' from source: role '' defaults 12755 1727204091.30706: Evaluated conditional (network_state != {}): False 12755 1727204091.30710: when evaluation is False, skipping this task 12755 1727204091.30713: _execute() done 12755 1727204091.30716: dumping result to json 12755 1727204091.30723: done dumping result, returning 12755 1727204091.30731: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-72e9-1a19-000000000028] 12755 1727204091.30739: sending task result for task 12b410aa-8751-72e9-1a19-000000000028 12755 1727204091.30843: done sending task result for task 12b410aa-8751-72e9-1a19-000000000028 12755 1727204091.30846: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204091.30917: no more pending results, 
returning what we have 12755 1727204091.30924: results queue empty 12755 1727204091.30925: checking for any_errors_fatal 12755 1727204091.30931: done checking for any_errors_fatal 12755 1727204091.30932: checking for max_fail_percentage 12755 1727204091.30934: done checking for max_fail_percentage 12755 1727204091.30935: checking to see if all hosts have failed and the running result is not ok 12755 1727204091.30936: done checking to see if all hosts have failed 12755 1727204091.30937: getting the remaining hosts for this loop 12755 1727204091.30939: done getting the remaining hosts for this loop 12755 1727204091.30943: getting the next task for host managed-node1 12755 1727204091.30951: done getting next task for host managed-node1 12755 1727204091.30956: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12755 1727204091.30960: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204091.30975: getting variables 12755 1727204091.30977: in VariableManager get_vars() 12755 1727204091.31028: Calling all_inventory to load vars for managed-node1 12755 1727204091.31031: Calling groups_inventory to load vars for managed-node1 12755 1727204091.31033: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204091.31043: Calling all_plugins_play to load vars for managed-node1 12755 1727204091.31046: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204091.31050: Calling groups_plugins_play to load vars for managed-node1 12755 1727204091.32403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204091.34244: done with get_vars() 12755 1727204091.34268: done getting variables 12755 1727204091.34322: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:51 -0400 (0:00:00.050) 0:00:16.579 ***** 12755 1727204091.34349: entering _queue_task() for managed-node1/fail 12755 1727204091.34614: worker is 1 (out of 1 available) 12755 1727204091.34630: exiting _queue_task() for managed-node1/fail 12755 1727204091.34644: done queuing things up, now waiting for results queue to drain 12755 1727204091.34646: waiting for pending results... 
12755 1727204091.34878: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12755 1727204091.34982: in run() - task 12b410aa-8751-72e9-1a19-000000000029 12755 1727204091.34999: variable 'ansible_search_path' from source: unknown 12755 1727204091.35005: variable 'ansible_search_path' from source: unknown 12755 1727204091.35038: calling self._execute() 12755 1727204091.35114: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204091.35127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.35136: variable 'omit' from source: magic vars 12755 1727204091.35457: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.35468: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204091.35576: variable 'network_state' from source: role '' defaults 12755 1727204091.35586: Evaluated conditional (network_state != {}): False 12755 1727204091.35591: when evaluation is False, skipping this task 12755 1727204091.35594: _execute() done 12755 1727204091.35597: dumping result to json 12755 1727204091.35603: done dumping result, returning 12755 1727204091.35610: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-72e9-1a19-000000000029] 12755 1727204091.35616: sending task result for task 12b410aa-8751-72e9-1a19-000000000029 12755 1727204091.35714: done sending task result for task 12b410aa-8751-72e9-1a19-000000000029 12755 1727204091.35717: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204091.35768: no more pending results, returning what we have 12755 
1727204091.35771: results queue empty 12755 1727204091.35773: checking for any_errors_fatal 12755 1727204091.35787: done checking for any_errors_fatal 12755 1727204091.35788: checking for max_fail_percentage 12755 1727204091.35792: done checking for max_fail_percentage 12755 1727204091.35793: checking to see if all hosts have failed and the running result is not ok 12755 1727204091.35794: done checking to see if all hosts have failed 12755 1727204091.35795: getting the remaining hosts for this loop 12755 1727204091.35797: done getting the remaining hosts for this loop 12755 1727204091.35801: getting the next task for host managed-node1 12755 1727204091.35808: done getting next task for host managed-node1 12755 1727204091.35812: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12755 1727204091.35815: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204091.35831: getting variables 12755 1727204091.35833: in VariableManager get_vars() 12755 1727204091.35882: Calling all_inventory to load vars for managed-node1 12755 1727204091.35885: Calling groups_inventory to load vars for managed-node1 12755 1727204091.35888: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204091.35935: Calling all_plugins_play to load vars for managed-node1 12755 1727204091.35939: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204091.35944: Calling groups_plugins_play to load vars for managed-node1 12755 1727204091.37296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204091.39287: done with get_vars() 12755 1727204091.39319: done getting variables 12755 1727204091.39370: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:51 -0400 (0:00:00.050) 0:00:16.629 ***** 12755 1727204091.39399: entering _queue_task() for managed-node1/fail 12755 1727204091.39718: worker is 1 (out of 1 available) 12755 1727204091.39735: exiting _queue_task() for managed-node1/fail 12755 1727204091.39748: done queuing things up, now waiting for results queue to drain 12755 1727204091.39749: waiting for pending results... 
12755 1727204091.39975: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12755 1727204091.40186: in run() - task 12b410aa-8751-72e9-1a19-00000000002a 12755 1727204091.40193: variable 'ansible_search_path' from source: unknown 12755 1727204091.40196: variable 'ansible_search_path' from source: unknown 12755 1727204091.40230: calling self._execute() 12755 1727204091.40387: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204091.40393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.40396: variable 'omit' from source: magic vars 12755 1727204091.40785: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.40791: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204091.40958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204091.43272: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204091.43339: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204091.43371: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204091.43408: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204091.43511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204091.43549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204091.43575: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204091.43604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.43640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204091.43652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204091.43744: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.43757: Evaluated conditional (ansible_distribution_major_version | int > 9): True 12755 1727204091.43861: variable 'ansible_distribution' from source: facts 12755 1727204091.43865: variable '__network_rh_distros' from source: role '' defaults 12755 1727204091.43873: Evaluated conditional (ansible_distribution in __network_rh_distros): False 12755 1727204091.43877: when evaluation is False, skipping this task 12755 1727204091.43881: _execute() done 12755 1727204091.43886: dumping result to json 12755 1727204091.43891: done dumping result, returning 12755 1727204091.43901: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-72e9-1a19-00000000002a] 12755 1727204091.43907: sending task result for task 12b410aa-8751-72e9-1a19-00000000002a 12755 1727204091.44015: done sending task result for task 12b410aa-8751-72e9-1a19-00000000002a 12755 1727204091.44023: WORKER PROCESS EXITING 
skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 12755 1727204091.44094: no more pending results, returning what we have 12755 1727204091.44099: results queue empty 12755 1727204091.44105: checking for any_errors_fatal 12755 1727204091.44111: done checking for any_errors_fatal 12755 1727204091.44112: checking for max_fail_percentage 12755 1727204091.44114: done checking for max_fail_percentage 12755 1727204091.44115: checking to see if all hosts have failed and the running result is not ok 12755 1727204091.44118: done checking to see if all hosts have failed 12755 1727204091.44120: getting the remaining hosts for this loop 12755 1727204091.44122: done getting the remaining hosts for this loop 12755 1727204091.44132: getting the next task for host managed-node1 12755 1727204091.44141: done getting next task for host managed-node1 12755 1727204091.44145: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12755 1727204091.44155: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204091.44180: getting variables 12755 1727204091.44183: in VariableManager get_vars() 12755 1727204091.44286: Calling all_inventory to load vars for managed-node1 12755 1727204091.44292: Calling groups_inventory to load vars for managed-node1 12755 1727204091.44298: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204091.44323: Calling all_plugins_play to load vars for managed-node1 12755 1727204091.44330: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204091.44335: Calling groups_plugins_play to load vars for managed-node1 12755 1727204091.46496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204091.49071: done with get_vars() 12755 1727204091.49115: done getting variables 12755 1727204091.49236: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:51 -0400 (0:00:00.098) 0:00:16.728 ***** 12755 1727204091.49278: entering _queue_task() for managed-node1/dnf 12755 1727204091.49538: worker is 1 (out of 1 available) 12755 1727204091.49553: exiting _queue_task() for managed-node1/dnf 12755 1727204091.49568: done queuing things up, now waiting for results queue to drain 12755 1727204091.49569: waiting for pending results... 
12755 1727204091.49776: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12755 1727204091.49881: in run() - task 12b410aa-8751-72e9-1a19-00000000002b 12755 1727204091.49897: variable 'ansible_search_path' from source: unknown 12755 1727204091.49903: variable 'ansible_search_path' from source: unknown 12755 1727204091.49943: calling self._execute() 12755 1727204091.50022: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204091.50030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.50041: variable 'omit' from source: magic vars 12755 1727204091.50360: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.50373: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204091.50550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204091.52578: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204091.53195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204091.53199: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204091.53202: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204091.53204: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204091.53239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204091.53281: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204091.53319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.53375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204091.53400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204091.53540: variable 'ansible_distribution' from source: facts 12755 1727204091.53552: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.53566: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12755 1727204091.53709: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204091.53879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204091.53915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204091.53947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.54106: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204091.54146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204091.54208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204091.54262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204091.54301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.54374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204091.54404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204091.54481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204091.54521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 
1727204091.54573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.54636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204091.54670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204091.54930: variable 'network_connections' from source: task vars 12755 1727204091.54949: variable 'controller_profile' from source: play vars 12755 1727204091.55049: variable 'controller_profile' from source: play vars 12755 1727204091.55100: variable 'controller_device' from source: play vars 12755 1727204091.55166: variable 'controller_device' from source: play vars 12755 1727204091.55183: variable 'port1_profile' from source: play vars 12755 1727204091.55281: variable 'port1_profile' from source: play vars 12755 1727204091.55320: variable 'dhcp_interface1' from source: play vars 12755 1727204091.55397: variable 'dhcp_interface1' from source: play vars 12755 1727204091.55427: variable 'controller_profile' from source: play vars 12755 1727204091.55508: variable 'controller_profile' from source: play vars 12755 1727204091.55535: variable 'port2_profile' from source: play vars 12755 1727204091.55645: variable 'port2_profile' from source: play vars 12755 1727204091.55649: variable 'dhcp_interface2' from source: play vars 12755 1727204091.55726: variable 'dhcp_interface2' from source: play vars 12755 1727204091.55739: variable 'controller_profile' from source: play vars 12755 1727204091.55831: variable 'controller_profile' from source: play vars 12755 1727204091.55972: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204091.56198: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204091.56251: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204091.56314: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204091.56347: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204091.56423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204091.56452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204091.56517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.56541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204091.56622: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204091.56997: variable 'network_connections' from source: task vars 12755 1727204091.57062: variable 'controller_profile' from source: play vars 12755 1727204091.57104: variable 'controller_profile' from source: play vars 12755 1727204091.57118: variable 'controller_device' from source: play vars 12755 1727204091.57209: variable 'controller_device' from source: play vars 12755 1727204091.57226: variable 
'port1_profile' from source: play vars 12755 1727204091.57318: variable 'port1_profile' from source: play vars 12755 1727204091.57333: variable 'dhcp_interface1' from source: play vars 12755 1727204091.57420: variable 'dhcp_interface1' from source: play vars 12755 1727204091.57432: variable 'controller_profile' from source: play vars 12755 1727204091.57519: variable 'controller_profile' from source: play vars 12755 1727204091.57595: variable 'port2_profile' from source: play vars 12755 1727204091.57603: variable 'port2_profile' from source: play vars 12755 1727204091.57627: variable 'dhcp_interface2' from source: play vars 12755 1727204091.57720: variable 'dhcp_interface2' from source: play vars 12755 1727204091.57726: variable 'controller_profile' from source: play vars 12755 1727204091.57812: variable 'controller_profile' from source: play vars 12755 1727204091.57938: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12755 1727204091.57941: when evaluation is False, skipping this task 12755 1727204091.57944: _execute() done 12755 1727204091.57946: dumping result to json 12755 1727204091.57949: done dumping result, returning 12755 1727204091.57953: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-00000000002b] 12755 1727204091.57956: sending task result for task 12b410aa-8751-72e9-1a19-00000000002b skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12755 1727204091.58092: no more pending results, returning what we have 12755 1727204091.58096: results queue empty 12755 1727204091.58097: checking for any_errors_fatal 12755 1727204091.58106: done checking for any_errors_fatal 12755 
1727204091.58107: checking for max_fail_percentage 12755 1727204091.58109: done checking for max_fail_percentage 12755 1727204091.58110: checking to see if all hosts have failed and the running result is not ok 12755 1727204091.58111: done checking to see if all hosts have failed 12755 1727204091.58112: getting the remaining hosts for this loop 12755 1727204091.58114: done getting the remaining hosts for this loop 12755 1727204091.58120: getting the next task for host managed-node1 12755 1727204091.58129: done getting next task for host managed-node1 12755 1727204091.58134: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12755 1727204091.58138: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204091.58155: getting variables 12755 1727204091.58157: in VariableManager get_vars() 12755 1727204091.58340: Calling all_inventory to load vars for managed-node1 12755 1727204091.58344: Calling groups_inventory to load vars for managed-node1 12755 1727204091.58347: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204091.58361: Calling all_plugins_play to load vars for managed-node1 12755 1727204091.58365: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204091.58368: Calling groups_plugins_play to load vars for managed-node1 12755 1727204091.59039: done sending task result for task 12b410aa-8751-72e9-1a19-00000000002b 12755 1727204091.59043: WORKER PROCESS EXITING 12755 1727204091.61185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204091.64281: done with get_vars() 12755 1727204091.64328: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12755 1727204091.64431: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:51 -0400 (0:00:00.151) 0:00:16.880 ***** 12755 1727204091.64470: entering _queue_task() for managed-node1/yum 12755 1727204091.64472: Creating lock for yum 12755 1727204091.64892: worker is 1 (out of 1 available) 12755 1727204091.64908: exiting _queue_task() for managed-node1/yum 12755 
1727204091.64920: done queuing things up, now waiting for results queue to drain 12755 1727204091.64922: waiting for pending results... 12755 1727204091.65212: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12755 1727204091.65376: in run() - task 12b410aa-8751-72e9-1a19-00000000002c 12755 1727204091.65412: variable 'ansible_search_path' from source: unknown 12755 1727204091.65423: variable 'ansible_search_path' from source: unknown 12755 1727204091.65471: calling self._execute() 12755 1727204091.65586: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204091.65605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.65634: variable 'omit' from source: magic vars 12755 1727204091.66102: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.66122: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204091.66368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204091.69193: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204091.69307: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204091.69370: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204091.69437: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204091.69495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204091.69591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204091.69635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204091.69755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.69760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204091.69779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204091.69919: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.69944: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12755 1727204091.69955: when evaluation is False, skipping this task 12755 1727204091.69966: _execute() done 12755 1727204091.69992: dumping result to json 12755 1727204091.69997: done dumping result, returning 12755 1727204091.70080: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-00000000002c] 12755 1727204091.70084: sending task result for task 12b410aa-8751-72e9-1a19-00000000002c 12755 1727204091.70170: done sending task result for task 12b410aa-8751-72e9-1a19-00000000002c 12755 1727204091.70174: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12755 1727204091.70241: no more pending results, returning what we have 12755 1727204091.70246: results queue empty 12755 1727204091.70247: checking for any_errors_fatal 12755 1727204091.70255: done checking for any_errors_fatal 12755 1727204091.70256: checking for max_fail_percentage 12755 1727204091.70259: done checking for max_fail_percentage 12755 1727204091.70260: checking to see if all hosts have failed and the running result is not ok 12755 1727204091.70261: done checking to see if all hosts have failed 12755 1727204091.70262: getting the remaining hosts for this loop 12755 1727204091.70264: done getting the remaining hosts for this loop 12755 1727204091.70269: getting the next task for host managed-node1 12755 1727204091.70279: done getting next task for host managed-node1 12755 1727204091.70286: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12755 1727204091.70290: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204091.70310: getting variables 12755 1727204091.70313: in VariableManager get_vars() 12755 1727204091.70381: Calling all_inventory to load vars for managed-node1 12755 1727204091.70385: Calling groups_inventory to load vars for managed-node1 12755 1727204091.70388: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204091.70606: Calling all_plugins_play to load vars for managed-node1 12755 1727204091.70616: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204091.70620: Calling groups_plugins_play to load vars for managed-node1 12755 1727204091.72997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204091.83179: done with get_vars() 12755 1727204091.83224: done getting variables 12755 1727204091.83301: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:51 -0400 (0:00:00.188) 0:00:17.069 ***** 12755 1727204091.83342: entering _queue_task() for managed-node1/fail 12755 1727204091.83776: worker is 1 (out of 1 available) 12755 1727204091.83794: exiting _queue_task() for managed-node1/fail 12755 1727204091.83858: done queuing things up, now waiting for results queue to drain 12755 1727204091.83862: waiting for pending results... 
12755 1727204091.84268: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12755 1727204091.84305: in run() - task 12b410aa-8751-72e9-1a19-00000000002d 12755 1727204091.84361: variable 'ansible_search_path' from source: unknown 12755 1727204091.84365: variable 'ansible_search_path' from source: unknown 12755 1727204091.84380: calling self._execute() 12755 1727204091.84486: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204091.84504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.84530: variable 'omit' from source: magic vars 12755 1727204091.85126: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.85130: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204091.85313: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204091.85587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204091.88536: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204091.88625: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204091.88786: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204091.88793: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204091.88796: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204091.88835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12755 1727204091.88871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204091.88906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.88960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204091.88976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204091.89038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204091.89067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204091.89100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.89152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204091.89170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204091.89227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204091.89253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204091.89283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.89335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204091.89352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204091.89578: variable 'network_connections' from source: task vars 12755 1727204091.89593: variable 'controller_profile' from source: play vars 12755 1727204091.89677: variable 'controller_profile' from source: play vars 12755 1727204091.89688: variable 'controller_device' from source: play vars 12755 1727204091.89765: variable 'controller_device' from source: play vars 12755 1727204091.89773: variable 'port1_profile' from source: play vars 12755 1727204091.89853: variable 'port1_profile' from source: play vars 12755 1727204091.89862: variable 'dhcp_interface1' from source: play vars 12755 1727204091.89941: variable 'dhcp_interface1' from source: play vars 12755 1727204091.89949: variable 'controller_profile' from source: play vars 12755 
1727204091.90026: variable 'controller_profile' from source: play vars 12755 1727204091.90034: variable 'port2_profile' from source: play vars 12755 1727204091.90108: variable 'port2_profile' from source: play vars 12755 1727204091.90209: variable 'dhcp_interface2' from source: play vars 12755 1727204091.90212: variable 'dhcp_interface2' from source: play vars 12755 1727204091.90215: variable 'controller_profile' from source: play vars 12755 1727204091.90273: variable 'controller_profile' from source: play vars 12755 1727204091.90363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204091.90587: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204091.90640: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204091.90674: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204091.90757: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204091.90770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204091.90793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204091.90831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204091.90860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 12755 1727204091.90992: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204091.91262: variable 'network_connections' from source: task vars 12755 1727204091.91269: variable 'controller_profile' from source: play vars 12755 1727204091.91348: variable 'controller_profile' from source: play vars 12755 1727204091.91356: variable 'controller_device' from source: play vars 12755 1727204091.91432: variable 'controller_device' from source: play vars 12755 1727204091.91442: variable 'port1_profile' from source: play vars 12755 1727204091.91539: variable 'port1_profile' from source: play vars 12755 1727204091.91543: variable 'dhcp_interface1' from source: play vars 12755 1727204091.91600: variable 'dhcp_interface1' from source: play vars 12755 1727204091.91608: variable 'controller_profile' from source: play vars 12755 1727204091.91695: variable 'controller_profile' from source: play vars 12755 1727204091.91698: variable 'port2_profile' from source: play vars 12755 1727204091.91863: variable 'port2_profile' from source: play vars 12755 1727204091.91866: variable 'dhcp_interface2' from source: play vars 12755 1727204091.91869: variable 'dhcp_interface2' from source: play vars 12755 1727204091.91872: variable 'controller_profile' from source: play vars 12755 1727204091.91933: variable 'controller_profile' from source: play vars 12755 1727204091.91972: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12755 1727204091.91975: when evaluation is False, skipping this task 12755 1727204091.91978: _execute() done 12755 1727204091.91986: dumping result to json 12755 1727204091.91989: done dumping result, returning 12755 1727204091.91999: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-00000000002d] 12755 1727204091.92079: sending 
task result for task 12b410aa-8751-72e9-1a19-00000000002d
12755 1727204091.92154: done sending task result for task 12b410aa-8751-72e9-1a19-00000000002d
12755 1727204091.92157: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12755 1727204091.92220: no more pending results, returning what we have
12755 1727204091.92224: results queue empty
12755 1727204091.92225: checking for any_errors_fatal
12755 1727204091.92235: done checking for any_errors_fatal
12755 1727204091.92236: checking for max_fail_percentage
12755 1727204091.92237: done checking for max_fail_percentage
12755 1727204091.92238: checking to see if all hosts have failed and the running result is not ok
12755 1727204091.92239: done checking to see if all hosts have failed
12755 1727204091.92240: getting the remaining hosts for this loop
12755 1727204091.92242: done getting the remaining hosts for this loop
12755 1727204091.92246: getting the next task for host managed-node1
12755 1727204091.92254: done getting next task for host managed-node1
12755 1727204091.92258: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
12755 1727204091.92261: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204091.92332: getting variables
12755 1727204091.92334: in VariableManager get_vars()
12755 1727204091.92398: Calling all_inventory to load vars for managed-node1
12755 1727204091.92401: Calling groups_inventory to load vars for managed-node1
12755 1727204091.92405: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204091.92416: Calling all_plugins_play to load vars for managed-node1
12755 1727204091.92419: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204091.92424: Calling groups_plugins_play to load vars for managed-node1
12755 1727204091.94730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204091.97815: done with get_vars()
12755 1727204091.97855: done getting variables
12755 1727204091.97937: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024  14:54:51 -0400 (0:00:00.146)       0:00:17.215 *****
12755 1727204091.97978: entering _queue_task() for managed-node1/package
12755 1727204091.98521: worker is 1 (out of 1 available)
12755 1727204091.98533: exiting _queue_task() for managed-node1/package
12755 1727204091.98545: done queuing things up, now waiting for results queue to drain
12755 1727204091.98547: waiting for pending results...
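The skip above is driven entirely by the task's `when:` clause: with neither wireless nor team connections defined, the conditional evaluates to False and the executor returns a skip result instead of running the module. A minimal sketch of that decision, with hypothetical function and variable names (not Ansible's actual internals):

```python
def skip_result_if_false(condition_label, condition_value):
    """Return a skip result shaped like the log's `skipping:` block when a
    `when:` condition is False, or None when the task should run."""
    if condition_value:
        return None  # condition holds; the task would execute normally
    return {
        "changed": False,
        "false_condition": condition_label,
        "skip_reason": "Conditional result was False",
    }

# In this run neither kind of connection is defined, so the task is skipped.
wireless_defined = False
team_defined = False
result = skip_result_if_false(
    "__network_wireless_connections_defined or __network_team_connections_defined",
    wireless_defined or team_defined,
)
```

The `false_condition` field carries the literal condition text, which is why the log can report exactly which clause blocked the task.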
12755 1727204091.98794: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 12755 1727204091.98848: in run() - task 12b410aa-8751-72e9-1a19-00000000002e 12755 1727204091.98872: variable 'ansible_search_path' from source: unknown 12755 1727204091.98895: variable 'ansible_search_path' from source: unknown 12755 1727204091.99000: calling self._execute() 12755 1727204091.99058: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204091.99072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204091.99088: variable 'omit' from source: magic vars 12755 1727204091.99588: variable 'ansible_distribution_major_version' from source: facts 12755 1727204091.99610: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204091.99884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204092.00298: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204092.00321: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204092.00426: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204092.00472: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204092.00636: variable 'network_packages' from source: role '' defaults 12755 1727204092.00795: variable '__network_provider_setup' from source: role '' defaults 12755 1727204092.00815: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204092.00913: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204092.00954: variable '__network_packages_default_nm' from source: role '' defaults 12755 1727204092.01025: variable 
'__network_packages_default_nm' from source: role '' defaults 12755 1727204092.01324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204092.04443: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204092.04546: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204092.04592: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204092.04654: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204092.04687: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204092.04995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204092.04999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204092.05002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204092.05005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204092.05007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 
1727204092.05009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204092.05029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204092.05066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204092.05127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204092.05152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204092.05564: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12755 1727204092.05654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204092.05701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204092.05736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204092.05802: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204092.05825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204092.06000: variable 'ansible_python' from source: facts 12755 1727204092.06009: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12755 1727204092.06123: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204092.06240: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12755 1727204092.06422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204092.06469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204092.06509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204092.06655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204092.06660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204092.06663: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204092.06764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204092.06768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204092.06812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204092.06834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204092.07045: variable 'network_connections' from source: task vars 12755 1727204092.07057: variable 'controller_profile' from source: play vars 12755 1727204092.07201: variable 'controller_profile' from source: play vars 12755 1727204092.07225: variable 'controller_device' from source: play vars 12755 1727204092.07359: variable 'controller_device' from source: play vars 12755 1727204092.07419: variable 'port1_profile' from source: play vars 12755 1727204092.07524: variable 'port1_profile' from source: play vars 12755 1727204092.07545: variable 'dhcp_interface1' from source: play vars 12755 1727204092.07688: variable 'dhcp_interface1' from source: play vars 12755 1727204092.07707: variable 'controller_profile' from source: play vars 12755 1727204092.07838: variable 'controller_profile' from source: play vars 12755 1727204092.07879: variable 'port2_profile' from source: play vars 12755 
1727204092.08000: variable 'port2_profile' from source: play vars 12755 1727204092.08074: variable 'dhcp_interface2' from source: play vars 12755 1727204092.08148: variable 'dhcp_interface2' from source: play vars 12755 1727204092.08163: variable 'controller_profile' from source: play vars 12755 1727204092.08321: variable 'controller_profile' from source: play vars 12755 1727204092.08418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204092.08462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204092.08517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204092.08566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204092.08693: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204092.09073: variable 'network_connections' from source: task vars 12755 1727204092.09085: variable 'controller_profile' from source: play vars 12755 1727204092.09225: variable 'controller_profile' from source: play vars 12755 1727204092.09272: variable 'controller_device' from source: play vars 12755 1727204092.09383: variable 'controller_device' from source: play vars 12755 1727204092.09405: variable 'port1_profile' from source: play vars 12755 1727204092.09529: variable 'port1_profile' from source: play vars 12755 1727204092.09571: variable 'dhcp_interface1' from source: play vars 12755 1727204092.09659: variable 'dhcp_interface1' from source: 
play vars 12755 1727204092.09681: variable 'controller_profile' from source: play vars 12755 1727204092.09830: variable 'controller_profile' from source: play vars 12755 1727204092.09897: variable 'port2_profile' from source: play vars 12755 1727204092.09983: variable 'port2_profile' from source: play vars 12755 1727204092.10005: variable 'dhcp_interface2' from source: play vars 12755 1727204092.10210: variable 'dhcp_interface2' from source: play vars 12755 1727204092.10213: variable 'controller_profile' from source: play vars 12755 1727204092.10319: variable 'controller_profile' from source: play vars 12755 1727204092.10408: variable '__network_packages_default_wireless' from source: role '' defaults 12755 1727204092.10527: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204092.11005: variable 'network_connections' from source: task vars 12755 1727204092.11009: variable 'controller_profile' from source: play vars 12755 1727204092.11086: variable 'controller_profile' from source: play vars 12755 1727204092.11104: variable 'controller_device' from source: play vars 12755 1727204092.11195: variable 'controller_device' from source: play vars 12755 1727204092.11225: variable 'port1_profile' from source: play vars 12755 1727204092.11335: variable 'port1_profile' from source: play vars 12755 1727204092.11338: variable 'dhcp_interface1' from source: play vars 12755 1727204092.11410: variable 'dhcp_interface1' from source: play vars 12755 1727204092.11424: variable 'controller_profile' from source: play vars 12755 1727204092.11516: variable 'controller_profile' from source: play vars 12755 1727204092.11554: variable 'port2_profile' from source: play vars 12755 1727204092.11626: variable 'port2_profile' from source: play vars 12755 1727204092.11663: variable 'dhcp_interface2' from source: play vars 12755 1727204092.11736: variable 'dhcp_interface2' from source: play vars 12755 1727204092.11772: variable 'controller_profile' from 
source: play vars 12755 1727204092.11841: variable 'controller_profile' from source: play vars 12755 1727204092.12091: variable '__network_packages_default_team' from source: role '' defaults 12755 1727204092.12096: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204092.12441: variable 'network_connections' from source: task vars 12755 1727204092.12452: variable 'controller_profile' from source: play vars 12755 1727204092.12544: variable 'controller_profile' from source: play vars 12755 1727204092.12557: variable 'controller_device' from source: play vars 12755 1727204092.12638: variable 'controller_device' from source: play vars 12755 1727204092.12662: variable 'port1_profile' from source: play vars 12755 1727204092.12745: variable 'port1_profile' from source: play vars 12755 1727204092.12769: variable 'dhcp_interface1' from source: play vars 12755 1727204092.12852: variable 'dhcp_interface1' from source: play vars 12755 1727204092.12875: variable 'controller_profile' from source: play vars 12755 1727204092.12957: variable 'controller_profile' from source: play vars 12755 1727204092.12981: variable 'port2_profile' from source: play vars 12755 1727204092.13059: variable 'port2_profile' from source: play vars 12755 1727204092.13072: variable 'dhcp_interface2' from source: play vars 12755 1727204092.13160: variable 'dhcp_interface2' from source: play vars 12755 1727204092.13173: variable 'controller_profile' from source: play vars 12755 1727204092.13266: variable 'controller_profile' from source: play vars 12755 1727204092.13396: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204092.13457: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204092.13470: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204092.13560: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 
1727204092.13919: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12755 1727204092.14698: variable 'network_connections' from source: task vars 12755 1727204092.14702: variable 'controller_profile' from source: play vars 12755 1727204092.14752: variable 'controller_profile' from source: play vars 12755 1727204092.14771: variable 'controller_device' from source: play vars 12755 1727204092.14853: variable 'controller_device' from source: play vars 12755 1727204092.14866: variable 'port1_profile' from source: play vars 12755 1727204092.14937: variable 'port1_profile' from source: play vars 12755 1727204092.14959: variable 'dhcp_interface1' from source: play vars 12755 1727204092.15051: variable 'dhcp_interface1' from source: play vars 12755 1727204092.15054: variable 'controller_profile' from source: play vars 12755 1727204092.15128: variable 'controller_profile' from source: play vars 12755 1727204092.15142: variable 'port2_profile' from source: play vars 12755 1727204092.15270: variable 'port2_profile' from source: play vars 12755 1727204092.15278: variable 'dhcp_interface2' from source: play vars 12755 1727204092.15331: variable 'dhcp_interface2' from source: play vars 12755 1727204092.15344: variable 'controller_profile' from source: play vars 12755 1727204092.15431: variable 'controller_profile' from source: play vars 12755 1727204092.15446: variable 'ansible_distribution' from source: facts 12755 1727204092.15455: variable '__network_rh_distros' from source: role '' defaults 12755 1727204092.15487: variable 'ansible_distribution_major_version' from source: facts 12755 1727204092.15514: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12755 1727204092.15756: variable 'ansible_distribution' from source: facts 12755 1727204092.15766: variable '__network_rh_distros' from source: role '' defaults 12755 1727204092.15816: variable 'ansible_distribution_major_version' from source: 
facts
12755 1727204092.15824: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
12755 1727204092.16049: variable 'ansible_distribution' from source: facts
12755 1727204092.16059: variable '__network_rh_distros' from source: role '' defaults
12755 1727204092.16070: variable 'ansible_distribution_major_version' from source: facts
12755 1727204092.16119: variable 'network_provider' from source: set_fact
12755 1727204092.16193: variable 'ansible_facts' from source: unknown
12755 1727204092.17352: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
12755 1727204092.17356: when evaluation is False, skipping this task
12755 1727204092.17358: _execute() done
12755 1727204092.17360: dumping result to json
12755 1727204092.17363: done dumping result, returning
12755 1727204092.17365: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-72e9-1a19-00000000002e]
12755 1727204092.17375: sending task result for task 12b410aa-8751-72e9-1a19-00000000002e
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
12755 1727204092.17676: no more pending results, returning what we have
12755 1727204092.17681: results queue empty
12755 1727204092.17682: checking for any_errors_fatal
12755 1727204092.17693: done checking for any_errors_fatal
12755 1727204092.17694: checking for max_fail_percentage
12755 1727204092.17696: done checking for max_fail_percentage
12755 1727204092.17697: checking to see if all hosts have failed and the running result is not ok
12755 1727204092.17698: done checking to see if all hosts have failed
12755 1727204092.17699: getting the remaining hosts for this loop
12755 1727204092.17701: done getting the remaining hosts for this loop
12755 1727204092.17707: getting the next task for host
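The "Install packages" task is skipped because everything in `network_packages` is already installed: the condition `not network_packages is subset(ansible_facts.packages.keys())` applies Jinja2's `subset` test to the gathered package facts. The equivalent check, sketched in plain Python with made-up package names standing in for the real facts:

```python
# Hypothetical stand-ins for the role's network_packages list and the
# keys of ansible_facts.packages gathered on the managed node.
network_packages = ["NetworkManager"]
installed = {"NetworkManager", "openssh-server", "firewalld"}

# Jinja2's `a is subset(b)` is plain set containment; negating it asks
# "is anything still missing?" -- when nothing is, the task is skipped.
needs_install = not set(network_packages) <= installed
```

Here `needs_install` comes out False, matching the log's `Evaluated conditional (...): False` and the subsequent skip.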
managed-node1 12755 1727204092.17716: done getting next task for host managed-node1 12755 1727204092.17722: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12755 1727204092.17725: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204092.17743: getting variables 12755 1727204092.17745: in VariableManager get_vars() 12755 1727204092.18021: Calling all_inventory to load vars for managed-node1 12755 1727204092.18025: Calling groups_inventory to load vars for managed-node1 12755 1727204092.18028: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204092.18036: done sending task result for task 12b410aa-8751-72e9-1a19-00000000002e 12755 1727204092.18039: WORKER PROCESS EXITING 12755 1727204092.18051: Calling all_plugins_play to load vars for managed-node1 12755 1727204092.18055: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204092.18059: Calling groups_plugins_play to load vars for managed-node1 12755 1727204092.20639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204092.23709: done with get_vars() 12755 1727204092.23747: done getting variables 12755 1727204092.23833: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024  14:54:52 -0400 (0:00:00.258)       0:00:17.474 *****
12755 1727204092.23872: entering _queue_task() for managed-node1/package
12755 1727204092.24260: worker is 1 (out of 1 available)
12755 1727204092.24275: exiting _queue_task() for managed-node1/package
12755 1727204092.24444: done queuing things up, now waiting for results queue to drain
12755 1727204092.24446: waiting for pending results...
12755 1727204092.24606: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12755 1727204092.24802: in run() - task 12b410aa-8751-72e9-1a19-00000000002f
12755 1727204092.24825: variable 'ansible_search_path' from source: unknown
12755 1727204092.24836: variable 'ansible_search_path' from source: unknown
12755 1727204092.24896: calling self._execute()
12755 1727204092.25095: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204092.25099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204092.25104: variable 'omit' from source: magic vars
12755 1727204092.25523: variable 'ansible_distribution_major_version' from source: facts
12755 1727204092.25553: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204092.25765: variable 'network_state' from source: role '' defaults
12755 1727204092.25769: Evaluated conditional (network_state != {}): False
12755 1727204092.25773: when evaluation is False, skipping this task
12755 1727204092.25777: _execute() done
12755 1727204092.25779: dumping result to json
12755 1727204092.25782: done dumping result, returning
12755 1727204092.25876: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-72e9-1a19-00000000002f]
12755 1727204092.25880: sending task result for task 12b410aa-8751-72e9-1a19-00000000002f
12755 1727204092.25967: done sending task result for task 12b410aa-8751-72e9-1a19-00000000002f
12755 1727204092.25972: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12755 1727204092.26032: no more pending results, returning what we have
12755 1727204092.26036: results queue empty
12755 1727204092.26037: checking for any_errors_fatal
12755 1727204092.26043: done checking for any_errors_fatal
12755 1727204092.26044: checking for max_fail_percentage
12755 1727204092.26045: done checking for max_fail_percentage
12755 1727204092.26046: checking to see if all hosts have failed and the running result is not ok
12755 1727204092.26047: done checking to see if all hosts have failed
12755 1727204092.26048: getting the remaining hosts for this loop
12755 1727204092.26050: done getting the remaining hosts for this loop
12755 1727204092.26055: getting the next task for host managed-node1
12755 1727204092.26063: done getting next task for host managed-node1
12755 1727204092.26067: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
12755 1727204092.26070: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204092.26090: getting variables 12755 1727204092.26092: in VariableManager get_vars() 12755 1727204092.26155: Calling all_inventory to load vars for managed-node1 12755 1727204092.26159: Calling groups_inventory to load vars for managed-node1 12755 1727204092.26162: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204092.26177: Calling all_plugins_play to load vars for managed-node1 12755 1727204092.26180: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204092.26184: Calling groups_plugins_play to load vars for managed-node1 12755 1727204092.28647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204092.31956: done with get_vars() 12755 1727204092.32002: done getting variables 12755 1727204092.32070: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:52 -0400 (0:00:00.082) 0:00:17.556 ***** 12755 1727204092.32120: entering _queue_task() for managed-node1/package 12755 1727204092.32628: worker is 1 (out of 1 available) 12755 1727204092.32640: exiting _queue_task() 
for managed-node1/package 12755 1727204092.32651: done queuing things up, now waiting for results queue to drain 12755 1727204092.32652: waiting for pending results... 12755 1727204092.32898: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12755 1727204092.32973: in run() - task 12b410aa-8751-72e9-1a19-000000000030 12755 1727204092.33009: variable 'ansible_search_path' from source: unknown 12755 1727204092.33102: variable 'ansible_search_path' from source: unknown 12755 1727204092.33106: calling self._execute() 12755 1727204092.33182: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204092.33208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204092.33232: variable 'omit' from source: magic vars 12755 1727204092.33717: variable 'ansible_distribution_major_version' from source: facts 12755 1727204092.33738: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204092.33919: variable 'network_state' from source: role '' defaults 12755 1727204092.33940: Evaluated conditional (network_state != {}): False 12755 1727204092.33950: when evaluation is False, skipping this task 12755 1727204092.33960: _execute() done 12755 1727204092.33981: dumping result to json 12755 1727204092.33994: done dumping result, returning 12755 1727204092.34083: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-72e9-1a19-000000000030] 12755 1727204092.34088: sending task result for task 12b410aa-8751-72e9-1a19-000000000030 12755 1727204092.34170: done sending task result for task 12b410aa-8751-72e9-1a19-000000000030 12755 1727204092.34173: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" 
} 12755 1727204092.34233: no more pending results, returning what we have 12755 1727204092.34237: results queue empty 12755 1727204092.34239: checking for any_errors_fatal 12755 1727204092.34247: done checking for any_errors_fatal 12755 1727204092.34248: checking for max_fail_percentage 12755 1727204092.34250: done checking for max_fail_percentage 12755 1727204092.34251: checking to see if all hosts have failed and the running result is not ok 12755 1727204092.34252: done checking to see if all hosts have failed 12755 1727204092.34253: getting the remaining hosts for this loop 12755 1727204092.34255: done getting the remaining hosts for this loop 12755 1727204092.34261: getting the next task for host managed-node1 12755 1727204092.34269: done getting next task for host managed-node1 12755 1727204092.34274: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12755 1727204092.34278: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204092.34300: getting variables 12755 1727204092.34302: in VariableManager get_vars() 12755 1727204092.34374: Calling all_inventory to load vars for managed-node1 12755 1727204092.34378: Calling groups_inventory to load vars for managed-node1 12755 1727204092.34381: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204092.34598: Calling all_plugins_play to load vars for managed-node1 12755 1727204092.34602: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204092.34606: Calling groups_plugins_play to load vars for managed-node1 12755 1727204092.36936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204092.40086: done with get_vars() 12755 1727204092.40129: done getting variables 12755 1727204092.40259: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:52 -0400 (0:00:00.081) 0:00:17.638 ***** 12755 1727204092.40311: entering _queue_task() for managed-node1/service 12755 1727204092.40313: Creating lock for service 12755 1727204092.40830: worker is 1 (out of 1 available) 12755 1727204092.40842: exiting _queue_task() for managed-node1/service 12755 1727204092.40855: done queuing things up, now waiting for results queue to drain 12755 1727204092.40856: waiting for pending results... 
12755 1727204092.41284: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
12755 1727204092.41292: in run() - task 12b410aa-8751-72e9-1a19-000000000031
12755 1727204092.41296: variable 'ansible_search_path' from source: unknown
12755 1727204092.41298: variable 'ansible_search_path' from source: unknown
12755 1727204092.41315: calling self._execute()
12755 1727204092.41435: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204092.41449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204092.41466: variable 'omit' from source: magic vars
12755 1727204092.41955: variable 'ansible_distribution_major_version' from source: facts
12755 1727204092.41976: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204092.42157: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204092.42440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204092.45688: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204092.46158: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204092.46208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204092.46262: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204092.46316: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204092.46413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204092.46497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204092.46510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.46579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204092.46607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204092.46764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204092.46769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204092.46772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.46815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204092.46841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204092.46907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204092.46943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204092.46987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.47050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204092.47090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204092.47396: variable 'network_connections' from source: task vars
12755 1727204092.47399: variable 'controller_profile' from source: play vars
12755 1727204092.47472: variable 'controller_profile' from source: play vars
12755 1727204092.47502: variable 'controller_device' from source: play vars
12755 1727204092.47597: variable 'controller_device' from source: play vars
12755 1727204092.47614: variable 'port1_profile' from source: play vars
12755 1727204092.47703: variable 'port1_profile' from source: play vars
12755 1727204092.47717: variable 'dhcp_interface1' from source: play vars
12755 1727204092.47806: variable 'dhcp_interface1' from source: play vars
12755 1727204092.47820: variable 'controller_profile' from source: play vars
12755 1727204092.47949: variable 'controller_profile' from source: play vars
12755 1727204092.47959: variable 'port2_profile' from source: play vars
12755 1727204092.48018: variable 'port2_profile' from source: play vars
12755 1727204092.48034: variable 'dhcp_interface2' from source: play vars
12755 1727204092.48124: variable 'dhcp_interface2' from source: play vars
12755 1727204092.48136: variable 'controller_profile' from source: play vars
12755 1727204092.48288: variable 'controller_profile' from source: play vars
12755 1727204092.48321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204092.48548: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204092.48600: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204092.48672: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204092.48712: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204092.48773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204092.48806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204092.48868: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.48920: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204092.49051: variable '__network_team_connections_defined' from source: role '' defaults
12755 1727204092.49373: variable 'network_connections' from source: task vars
12755 1727204092.49399: variable 'controller_profile' from source: play vars
12755 1727204092.49619: variable 'controller_profile' from source: play vars
12755 1727204092.49623: variable 'controller_device' from source: play vars
12755 1727204092.49625: variable 'controller_device' from source: play vars
12755 1727204092.49628: variable 'port1_profile' from source: play vars
12755 1727204092.49710: variable 'port1_profile' from source: play vars
12755 1727204092.49734: variable 'dhcp_interface1' from source: play vars
12755 1727204092.50209: variable 'dhcp_interface1' from source: play vars
12755 1727204092.50213: variable 'controller_profile' from source: play vars
12755 1727204092.50233: variable 'controller_profile' from source: play vars
12755 1727204092.50246: variable 'port2_profile' from source: play vars
12755 1727204092.50537: variable 'port2_profile' from source: play vars
12755 1727204092.50540: variable 'dhcp_interface2' from source: play vars
12755 1727204092.50543: variable 'dhcp_interface2' from source: play vars
12755 1727204092.50545: variable 'controller_profile' from source: play vars
12755 1727204092.50702: variable 'controller_profile' from source: play vars
12755 1727204092.50867: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12755 1727204092.50878: when evaluation is False, skipping this task
12755 1727204092.50886: _execute() done
12755 1727204092.50897: dumping result to json
12755 1727204092.50905: done dumping result, returning
12755 1727204092.50920: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000031]
12755 1727204092.50930: sending task result for task 12b410aa-8751-72e9-1a19-000000000031
12755 1727204092.51276: done sending task result for task 12b410aa-8751-72e9-1a19-000000000031
12755 1727204092.51280: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12755 1727204092.51346: no more pending results, returning what we have
12755 1727204092.51350: results queue empty
12755 1727204092.51351: checking for any_errors_fatal
12755 1727204092.51358: done checking for any_errors_fatal
12755 1727204092.51360: checking for max_fail_percentage
12755 1727204092.51361: done checking for max_fail_percentage
12755 1727204092.51362: checking to see if all hosts have failed and the running result is not ok
12755 1727204092.51363: done checking to see if all hosts have failed
12755 1727204092.51364: getting the remaining hosts for this loop
12755 1727204092.51366: done getting the remaining hosts for this loop
12755 1727204092.51372: getting the next task for host managed-node1
12755 1727204092.51381: done getting next task for host managed-node1
12755 1727204092.51385: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
12755 1727204092.51391: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204092.51409: getting variables
12755 1727204092.51412: in VariableManager get_vars()
12755 1727204092.51479: Calling all_inventory to load vars for managed-node1
12755 1727204092.51483: Calling groups_inventory to load vars for managed-node1
12755 1727204092.51486: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204092.51805: Calling all_plugins_play to load vars for managed-node1
12755 1727204092.51810: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204092.51815: Calling groups_plugins_play to load vars for managed-node1
12755 1727204092.56829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204092.64132: done with get_vars()
12755 1727204092.64174: done getting variables
12755 1727204092.64257: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:54:52 -0400 (0:00:00.239) 0:00:17.880 *****
12755 1727204092.64502: entering _queue_task() for managed-node1/service
12755 1727204092.65174: worker is 1 (out of 1 available)
12755 1727204092.65193: exiting _queue_task() for managed-node1/service
12755 1727204092.65208: done queuing things up, now waiting for results queue to drain
12755 1727204092.65210: waiting for pending results...
12755 1727204092.65925: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
12755 1727204092.66178: in run() - task 12b410aa-8751-72e9-1a19-000000000032
12755 1727204092.66196: variable 'ansible_search_path' from source: unknown
12755 1727204092.66200: variable 'ansible_search_path' from source: unknown
12755 1727204092.66363: calling self._execute()
12755 1727204092.66601: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204092.66610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204092.66628: variable 'omit' from source: magic vars
12755 1727204092.67779: variable 'ansible_distribution_major_version' from source: facts
12755 1727204092.67784: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204092.68138: variable 'network_provider' from source: set_fact
12755 1727204092.68144: variable 'network_state' from source: role '' defaults
12755 1727204092.68157: Evaluated conditional (network_provider == "nm" or network_state != {}): True
12755 1727204092.68165: variable 'omit' from source: magic vars
12755 1727204092.68352: variable 'omit' from source: magic vars
12755 1727204092.68566: variable 'network_service_name' from source: role '' defaults
12755 1727204092.68655: variable 'network_service_name' from source: role '' defaults
12755 1727204092.68933: variable '__network_provider_setup' from source: role '' defaults
12755 1727204092.68940: variable '__network_service_name_default_nm' from source: role '' defaults
12755 1727204092.69191: variable '__network_service_name_default_nm' from source: role '' defaults
12755 1727204092.69195: variable '__network_packages_default_nm' from source: role '' defaults
12755 1727204092.69285: variable '__network_packages_default_nm' from source: role '' defaults
12755 1727204092.69652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204092.73391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204092.73395: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204092.73439: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204092.73480: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204092.73516: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204092.73616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204092.73657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204092.73690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.73751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204092.73769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204092.73837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204092.73865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204092.73956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.73960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204092.73995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204092.74392: variable '__network_packages_default_gobject_packages' from source: role '' defaults
12755 1727204092.74465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204092.74500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204092.74536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.74584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204092.74607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204092.74726: variable 'ansible_python' from source: facts
12755 1727204092.74751: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
12755 1727204092.74856: variable '__network_wpa_supplicant_required' from source: role '' defaults
12755 1727204092.75218: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
12755 1727204092.75435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204092.75464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204092.75598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.75655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204092.75672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204092.75953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204092.75995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204092.76190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.76195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204092.76198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204092.76661: variable 'network_connections' from source: task vars
12755 1727204092.76669: variable 'controller_profile' from source: play vars
12755 1727204092.77199: variable 'controller_profile' from source: play vars
12755 1727204092.77202: variable 'controller_device' from source: play vars
12755 1727204092.77204: variable 'controller_device' from source: play vars
12755 1727204092.77206: variable 'port1_profile' from source: play vars
12755 1727204092.77462: variable 'port1_profile' from source: play vars
12755 1727204092.77476: variable 'dhcp_interface1' from source: play vars
12755 1727204092.77567: variable 'dhcp_interface1' from source: play vars
12755 1727204092.77579: variable 'controller_profile' from source: play vars
12755 1727204092.77880: variable 'controller_profile' from source: play vars
12755 1727204092.77894: variable 'port2_profile' from source: play vars
12755 1727204092.77982: variable 'port2_profile' from source: play vars
12755 1727204092.78050: variable 'dhcp_interface2' from source: play vars
12755 1727204092.78290: variable 'dhcp_interface2' from source: play vars
12755 1727204092.78296: variable 'controller_profile' from source: play vars
12755 1727204092.78299: variable 'controller_profile' from source: play vars
12755 1727204092.78510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204092.78656: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204092.78713: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204092.78766: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204092.78815: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204092.78892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204092.79141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204092.79180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204092.79223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204092.79280: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204092.80746: variable 'network_connections' from source: task vars
12755 1727204092.80751: variable 'controller_profile' from source: play vars
12755 1727204092.80761: variable 'controller_profile' from source: play vars
12755 1727204092.80775: variable 'controller_device' from source: play vars
12755 1727204092.81179: variable 'controller_device' from source: play vars
12755 1727204092.81182: variable 'port1_profile' from source: play vars
12755 1727204092.81185: variable 'port1_profile' from source: play vars
12755 1727204092.81188: variable 'dhcp_interface1' from source: play vars
12755 1727204092.81391: variable 'dhcp_interface1' from source: play vars
12755 1727204092.81404: variable 'controller_profile' from source: play vars
12755 1727204092.81493: variable 'controller_profile' from source: play vars
12755 1727204092.81810: variable 'port2_profile' from source: play vars
12755 1727204092.81908: variable 'port2_profile' from source: play vars
12755 1727204092.81923: variable 'dhcp_interface2' from source: play vars
12755 1727204092.82214: variable 'dhcp_interface2' from source: play vars
12755 1727204092.82230: variable 'controller_profile' from source: play vars
12755 1727204092.82409: variable 'controller_profile' from source: play vars
12755 1727204092.82471: variable '__network_packages_default_wireless' from source: role '' defaults
12755 1727204092.82580: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204092.83869: variable 'network_connections' from source: task vars
12755 1727204092.83873: variable 'controller_profile' from source: play vars
12755 1727204092.83893: variable 'controller_profile' from source: play vars
12755 1727204092.84142: variable 'controller_device' from source: play vars
12755 1727204092.84146: variable 'controller_device' from source: play vars
12755 1727204092.84148: variable 'port1_profile' from source: play vars
12755 1727204092.84185: variable 'port1_profile' from source: play vars
12755 1727204092.84404: variable 'dhcp_interface1' from source: play vars
12755 1727204092.84486: variable 'dhcp_interface1' from source: play vars
12755 1727204092.84497: variable 'controller_profile' from source: play vars
12755 1727204092.84593: variable 'controller_profile' from source: play vars 12755 1727204092.84801: variable 'port2_profile' from source: play vars 12755 1727204092.84898: variable 'port2_profile' from source: play vars 12755 1727204092.84909: variable 'dhcp_interface2' from source: play vars 12755 1727204092.84988: variable 'dhcp_interface2' from source: play vars 12755 1727204092.85202: variable 'controller_profile' from source: play vars 12755 1727204092.85285: variable 'controller_profile' from source: play vars 12755 1727204092.85525: variable '__network_packages_default_team' from source: role '' defaults 12755 1727204092.85645: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204092.86747: variable 'network_connections' from source: task vars 12755 1727204092.86751: variable 'controller_profile' from source: play vars 12755 1727204092.87146: variable 'controller_profile' from source: play vars 12755 1727204092.87154: variable 'controller_device' from source: play vars 12755 1727204092.87475: variable 'controller_device' from source: play vars 12755 1727204092.87478: variable 'port1_profile' from source: play vars 12755 1727204092.87561: variable 'port1_profile' from source: play vars 12755 1727204092.87584: variable 'dhcp_interface1' from source: play vars 12755 1727204092.88058: variable 'dhcp_interface1' from source: play vars 12755 1727204092.88066: variable 'controller_profile' from source: play vars 12755 1727204092.88259: variable 'controller_profile' from source: play vars 12755 1727204092.88265: variable 'port2_profile' from source: play vars 12755 1727204092.88695: variable 'port2_profile' from source: play vars 12755 1727204092.88700: variable 'dhcp_interface2' from source: play vars 12755 1727204092.88704: variable 'dhcp_interface2' from source: play vars 12755 1727204092.88707: variable 'controller_profile' from source: play vars 12755 1727204092.88826: variable 'controller_profile' from source: play vars 
12755 1727204092.89416: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204092.89588: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204092.89594: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204092.89999: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204092.90745: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12755 1727204092.92405: variable 'network_connections' from source: task vars 12755 1727204092.92411: variable 'controller_profile' from source: play vars 12755 1727204092.92542: variable 'controller_profile' from source: play vars 12755 1727204092.92600: variable 'controller_device' from source: play vars 12755 1727204092.92673: variable 'controller_device' from source: play vars 12755 1727204092.92683: variable 'port1_profile' from source: play vars 12755 1727204092.92870: variable 'port1_profile' from source: play vars 12755 1727204092.92873: variable 'dhcp_interface1' from source: play vars 12755 1727204092.93000: variable 'dhcp_interface1' from source: play vars 12755 1727204092.93006: variable 'controller_profile' from source: play vars 12755 1727204092.93095: variable 'controller_profile' from source: play vars 12755 1727204092.93098: variable 'port2_profile' from source: play vars 12755 1727204092.93479: variable 'port2_profile' from source: play vars 12755 1727204092.93530: variable 'dhcp_interface2' from source: play vars 12755 1727204092.93669: variable 'dhcp_interface2' from source: play vars 12755 1727204092.93997: variable 'controller_profile' from source: play vars 12755 1727204092.94001: variable 'controller_profile' from source: play vars 12755 1727204092.94013: variable 'ansible_distribution' from source: facts 12755 1727204092.94016: variable '__network_rh_distros' from source: role '' defaults 12755 1727204092.94021: 
variable 'ansible_distribution_major_version' from source: facts 12755 1727204092.94132: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12755 1727204092.94585: variable 'ansible_distribution' from source: facts 12755 1727204092.94662: variable '__network_rh_distros' from source: role '' defaults 12755 1727204092.94674: variable 'ansible_distribution_major_version' from source: facts 12755 1727204092.94686: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12755 1727204092.95244: variable 'ansible_distribution' from source: facts 12755 1727204092.95257: variable '__network_rh_distros' from source: role '' defaults 12755 1727204092.95270: variable 'ansible_distribution_major_version' from source: facts 12755 1727204092.95330: variable 'network_provider' from source: set_fact 12755 1727204092.95447: variable 'omit' from source: magic vars 12755 1727204092.95487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204092.95603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204092.95645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204092.95796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204092.95991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204092.95998: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204092.96000: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204092.96003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204092.96113: Set connection var ansible_connection to ssh 12755 
1727204092.96133: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204092.96143: Set connection var ansible_shell_type to sh 12755 1727204092.96163: Set connection var ansible_timeout to 10 12755 1727204092.96176: Set connection var ansible_shell_executable to /bin/sh 12755 1727204092.96192: Set connection var ansible_pipelining to False 12755 1727204092.96234: variable 'ansible_shell_executable' from source: unknown 12755 1727204092.96245: variable 'ansible_connection' from source: unknown 12755 1727204092.96255: variable 'ansible_module_compression' from source: unknown 12755 1727204092.96263: variable 'ansible_shell_type' from source: unknown 12755 1727204092.96270: variable 'ansible_shell_executable' from source: unknown 12755 1727204092.96278: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204092.96287: variable 'ansible_pipelining' from source: unknown 12755 1727204092.96299: variable 'ansible_timeout' from source: unknown 12755 1727204092.96314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204092.96456: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204092.96475: variable 'omit' from source: magic vars 12755 1727204092.96487: starting attempt loop 12755 1727204092.96498: running the handler 12755 1727204092.96610: variable 'ansible_facts' from source: unknown 12755 1727204092.98180: _low_level_execute_command(): starting 12755 1727204092.98184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204092.98872: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204092.98887: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12755 1727204092.98908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204092.98940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204092.99047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204092.99304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204092.99382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204093.01137: stdout chunk (state=3): >>>/root <<< 12755 1727204093.01328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204093.01332: stdout chunk (state=3): >>><<< 12755 1727204093.01335: stderr chunk (state=3): >>><<< 12755 1727204093.01356: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204093.01395: _low_level_execute_command(): starting 12755 1727204093.01399: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251 `" && echo ansible-tmp-1727204093.0136757-13829-133127436095251="` echo /root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251 `" ) && sleep 0' 12755 1727204093.02776: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204093.02797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204093.02920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204093.02926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204093.03009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204093.03032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204093.03072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204093.03156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204093.05254: stdout chunk (state=3): >>>ansible-tmp-1727204093.0136757-13829-133127436095251=/root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251 <<< 12755 1727204093.05374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204093.05477: stderr chunk (state=3): >>><<< 12755 1727204093.05502: stdout chunk (state=3): >>><<< 12755 1727204093.05694: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204093.0136757-13829-133127436095251=/root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204093.05697: variable 'ansible_module_compression' from source: unknown 12755 1727204093.05701: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 12755 1727204093.05703: ANSIBALLZ: Acquiring lock 12755 1727204093.05705: ANSIBALLZ: Lock acquired: 139630693732560 12755 1727204093.05713: ANSIBALLZ: Creating module 12755 1727204093.58241: ANSIBALLZ: Writing module into payload 12755 1727204093.58513: ANSIBALLZ: Writing module 12755 1727204093.58897: ANSIBALLZ: Renaming module 12755 1727204093.58901: ANSIBALLZ: Done creating module 12755 1727204093.58903: variable 'ansible_facts' from source: unknown 12755 1727204093.58913: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/AnsiballZ_systemd.py 12755 1727204093.59091: Sending initial data 12755 1727204093.59095: Sent initial data (156 bytes) 12755 1727204093.59805: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204093.59844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204093.59857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204093.59866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204093.59956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204093.61701: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204093.61756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204093.61814: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpkx_7h8tl /root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/AnsiballZ_systemd.py <<< 12755 1727204093.61821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/AnsiballZ_systemd.py" <<< 12755 1727204093.61861: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpkx_7h8tl" to remote "/root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/AnsiballZ_systemd.py" <<< 12755 1727204093.64550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204093.64632: stderr chunk (state=3): >>><<< 12755 1727204093.64771: stdout chunk (state=3): >>><<< 12755 1727204093.64775: done transferring module to remote 12755 1727204093.64778: _low_level_execute_command(): starting 12755 1727204093.64781: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/ /root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/AnsiballZ_systemd.py && sleep 0' 12755 1727204093.65745: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204093.65749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204093.65752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 12755 1727204093.65755: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204093.65757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204093.65873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204093.66049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204093.66143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204093.68173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204093.68177: stdout chunk (state=3): >>><<< 12755 1727204093.68185: stderr chunk (state=3): >>><<< 12755 1727204093.68212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204093.68215: _low_level_execute_command(): starting 12755 1727204093.68221: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/AnsiballZ_systemd.py && sleep 0' 12755 1727204093.69095: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204093.69099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204093.69105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204093.69108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204093.69110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204093.69113: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204093.69116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204093.69121: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204093.69123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204093.69126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204093.69179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204094.03098: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 
}", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11997184", "MemoryAvailable": "infinity", "CPUUsageNSec": "777658000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": 
"4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "inf<<< 12755 1727204094.03133: stdout chunk (state=3): >>>inity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", 
"DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": 
"system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": 
"17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12755 1727204094.05398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204094.05402: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. <<< 12755 1727204094.05405: stderr chunk (state=3): >>><<< 12755 1727204094.05407: stdout chunk (state=3): >>><<< 12755 1727204094.05410: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": 
"17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11997184", "MemoryAvailable": "infinity", "CPUUsageNSec": "777658000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", 
"StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204094.05649: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204094.05670: _low_level_execute_command(): starting 12755 1727204094.05676: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204093.0136757-13829-133127436095251/ > /dev/null 2>&1 && sleep 0' 12755 1727204094.06428: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204094.06510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204094.06567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204094.06606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204094.06627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204094.06717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204094.08750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204094.08767: stdout chunk (state=3): >>><<< 12755 1727204094.08786: stderr chunk (state=3): >>><<< 12755 1727204094.08895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204094.08899: handler run complete 12755 1727204094.08918: attempt loop complete, returning result 12755 1727204094.08927: _execute() done 12755 1727204094.08935: dumping result to json 12755 1727204094.08963: done dumping result, returning 12755 1727204094.08980: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-72e9-1a19-000000000032] 12755 1727204094.08993: sending task result for task 12b410aa-8751-72e9-1a19-000000000032 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204094.09620: no more pending results, returning what we have 12755 1727204094.09624: results queue empty 12755 1727204094.09625: checking for any_errors_fatal 12755 1727204094.09634: done checking for any_errors_fatal 12755 1727204094.09635: checking for max_fail_percentage 12755 1727204094.09637: done checking for max_fail_percentage 12755 1727204094.09638: checking to see if all hosts have failed and the running result is not ok 12755 1727204094.09640: done checking to see if all hosts have failed 12755 1727204094.09640: getting the remaining hosts for this loop 12755 1727204094.09642: done getting the remaining hosts for this loop 12755 1727204094.09647: getting the next task for host managed-node1 12755 1727204094.09655: done getting next task for host managed-node1 12755 1727204094.09660: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12755 1727204094.09663: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204094.09679: getting variables 12755 1727204094.09681: in VariableManager get_vars() 12755 1727204094.09739: Calling all_inventory to load vars for managed-node1 12755 1727204094.09743: Calling groups_inventory to load vars for managed-node1 12755 1727204094.09745: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204094.09757: Calling all_plugins_play to load vars for managed-node1 12755 1727204094.09760: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204094.09763: Calling groups_plugins_play to load vars for managed-node1 12755 1727204094.10596: done sending task result for task 12b410aa-8751-72e9-1a19-000000000032 12755 1727204094.10600: WORKER PROCESS EXITING 12755 1727204094.12243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204094.15212: done with get_vars() 12755 1727204094.15251: done getting variables 12755 1727204094.15330: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:54 -0400 (0:00:01.508) 0:00:19.389 ***** 12755 1727204094.15370: entering _queue_task() for 
managed-node1/service 12755 1727204094.15722: worker is 1 (out of 1 available) 12755 1727204094.15738: exiting _queue_task() for managed-node1/service 12755 1727204094.15753: done queuing things up, now waiting for results queue to drain 12755 1727204094.15754: waiting for pending results... 12755 1727204094.16060: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12755 1727204094.16241: in run() - task 12b410aa-8751-72e9-1a19-000000000033 12755 1727204094.16262: variable 'ansible_search_path' from source: unknown 12755 1727204094.16269: variable 'ansible_search_path' from source: unknown 12755 1727204094.16319: calling self._execute() 12755 1727204094.16436: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204094.16452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204094.16566: variable 'omit' from source: magic vars 12755 1727204094.17295: variable 'ansible_distribution_major_version' from source: facts 12755 1727204094.17299: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204094.17599: variable 'network_provider' from source: set_fact 12755 1727204094.17603: Evaluated conditional (network_provider == "nm"): True 12755 1727204094.17759: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204094.17877: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12755 1727204094.18450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204094.22763: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204094.22996: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204094.23057: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204094.23164: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204094.23261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204094.23496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204094.23577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204094.23701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204094.23763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204094.23990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204094.23996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204094.24207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204094.24210: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204094.24213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204094.24307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204094.24375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204094.24535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204094.24572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204094.24719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204094.24744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204094.25009: variable 'network_connections' from source: task vars 12755 1727204094.25026: variable 'controller_profile' from source: play vars 12755 1727204094.25113: variable 'controller_profile' from source: play vars 12755 
1727204094.25124: variable 'controller_device' from source: play vars 12755 1727204094.25297: variable 'controller_device' from source: play vars 12755 1727204094.25301: variable 'port1_profile' from source: play vars 12755 1727204094.25303: variable 'port1_profile' from source: play vars 12755 1727204094.25306: variable 'dhcp_interface1' from source: play vars 12755 1727204094.25380: variable 'dhcp_interface1' from source: play vars 12755 1727204094.25387: variable 'controller_profile' from source: play vars 12755 1727204094.25468: variable 'controller_profile' from source: play vars 12755 1727204094.25476: variable 'port2_profile' from source: play vars 12755 1727204094.25556: variable 'port2_profile' from source: play vars 12755 1727204094.25564: variable 'dhcp_interface2' from source: play vars 12755 1727204094.25643: variable 'dhcp_interface2' from source: play vars 12755 1727204094.25651: variable 'controller_profile' from source: play vars 12755 1727204094.25723: variable 'controller_profile' from source: play vars 12755 1727204094.25821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204094.26041: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204094.26097: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204094.26167: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204094.26170: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204094.26215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204094.26242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204094.26276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204094.29481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204094.29485: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204094.29488: variable 'network_connections' from source: task vars 12755 1727204094.29492: variable 'controller_profile' from source: play vars 12755 1727204094.29494: variable 'controller_profile' from source: play vars 12755 1727204094.29496: variable 'controller_device' from source: play vars 12755 1727204094.29498: variable 'controller_device' from source: play vars 12755 1727204094.29500: variable 'port1_profile' from source: play vars 12755 1727204094.29502: variable 'port1_profile' from source: play vars 12755 1727204094.29503: variable 'dhcp_interface1' from source: play vars 12755 1727204094.29505: variable 'dhcp_interface1' from source: play vars 12755 1727204094.29507: variable 'controller_profile' from source: play vars 12755 1727204094.29509: variable 'controller_profile' from source: play vars 12755 1727204094.29511: variable 'port2_profile' from source: play vars 12755 1727204094.29513: variable 'port2_profile' from source: play vars 12755 1727204094.29515: variable 'dhcp_interface2' from source: play vars 12755 1727204094.29520: variable 'dhcp_interface2' from source: play vars 12755 1727204094.29522: variable 'controller_profile' from source: play vars 12755 1727204094.29524: variable 'controller_profile' from source: play vars 12755 1727204094.29525: Evaluated conditional 
(__network_wpa_supplicant_required): False 12755 1727204094.29527: when evaluation is False, skipping this task 12755 1727204094.29529: _execute() done 12755 1727204094.29531: dumping result to json 12755 1727204094.29533: done dumping result, returning 12755 1727204094.29535: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-72e9-1a19-000000000033] 12755 1727204094.29537: sending task result for task 12b410aa-8751-72e9-1a19-000000000033 12755 1727204094.29619: done sending task result for task 12b410aa-8751-72e9-1a19-000000000033 12755 1727204094.29624: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12755 1727204094.29709: no more pending results, returning what we have 12755 1727204094.29713: results queue empty 12755 1727204094.29714: checking for any_errors_fatal 12755 1727204094.29736: done checking for any_errors_fatal 12755 1727204094.29737: checking for max_fail_percentage 12755 1727204094.29739: done checking for max_fail_percentage 12755 1727204094.29740: checking to see if all hosts have failed and the running result is not ok 12755 1727204094.29741: done checking to see if all hosts have failed 12755 1727204094.29742: getting the remaining hosts for this loop 12755 1727204094.29743: done getting the remaining hosts for this loop 12755 1727204094.29747: getting the next task for host managed-node1 12755 1727204094.29753: done getting next task for host managed-node1 12755 1727204094.29758: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12755 1727204094.29760: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204094.29775: getting variables 12755 1727204094.29776: in VariableManager get_vars() 12755 1727204094.29840: Calling all_inventory to load vars for managed-node1 12755 1727204094.29844: Calling groups_inventory to load vars for managed-node1 12755 1727204094.29847: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204094.29860: Calling all_plugins_play to load vars for managed-node1 12755 1727204094.29864: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204094.29868: Calling groups_plugins_play to load vars for managed-node1 12755 1727204094.33542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204094.38171: done with get_vars() 12755 1727204094.38224: done getting variables 12755 1727204094.38304: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:54 -0400 (0:00:00.229) 0:00:19.619 ***** 12755 1727204094.38350: entering _queue_task() for managed-node1/service 12755 1727204094.38814: worker is 1 (out of 1 available) 12755 1727204094.38828: exiting _queue_task() for managed-node1/service 
12755 1727204094.38843: done queuing things up, now waiting for results queue to drain 12755 1727204094.38845: waiting for pending results... 12755 1727204094.39639: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 12755 1727204094.39645: in run() - task 12b410aa-8751-72e9-1a19-000000000034 12755 1727204094.39649: variable 'ansible_search_path' from source: unknown 12755 1727204094.39652: variable 'ansible_search_path' from source: unknown 12755 1727204094.39661: calling self._execute() 12755 1727204094.39813: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204094.39831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204094.39857: variable 'omit' from source: magic vars 12755 1727204094.40337: variable 'ansible_distribution_major_version' from source: facts 12755 1727204094.40359: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204094.40529: variable 'network_provider' from source: set_fact 12755 1727204094.40543: Evaluated conditional (network_provider == "initscripts"): False 12755 1727204094.40552: when evaluation is False, skipping this task 12755 1727204094.40559: _execute() done 12755 1727204094.40569: dumping result to json 12755 1727204094.40608: done dumping result, returning 12755 1727204094.40612: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-72e9-1a19-000000000034] 12755 1727204094.40718: sending task result for task 12b410aa-8751-72e9-1a19-000000000034 12755 1727204094.40803: done sending task result for task 12b410aa-8751-72e9-1a19-000000000034 12755 1727204094.40808: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204094.40864: no more pending results, returning what we have 
12755 1727204094.40868: results queue empty 12755 1727204094.40869: checking for any_errors_fatal 12755 1727204094.40882: done checking for any_errors_fatal 12755 1727204094.40883: checking for max_fail_percentage 12755 1727204094.40885: done checking for max_fail_percentage 12755 1727204094.40886: checking to see if all hosts have failed and the running result is not ok 12755 1727204094.40887: done checking to see if all hosts have failed 12755 1727204094.40888: getting the remaining hosts for this loop 12755 1727204094.40891: done getting the remaining hosts for this loop 12755 1727204094.40896: getting the next task for host managed-node1 12755 1727204094.40904: done getting next task for host managed-node1 12755 1727204094.40908: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12755 1727204094.40911: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204094.40930: getting variables 12755 1727204094.40933: in VariableManager get_vars() 12755 1727204094.41058: Calling all_inventory to load vars for managed-node1 12755 1727204094.41062: Calling groups_inventory to load vars for managed-node1 12755 1727204094.41065: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204094.41080: Calling all_plugins_play to load vars for managed-node1 12755 1727204094.41083: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204094.41087: Calling groups_plugins_play to load vars for managed-node1 12755 1727204094.43543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204094.46468: done with get_vars() 12755 1727204094.46516: done getting variables 12755 1727204094.46594: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:54 -0400 (0:00:00.082) 0:00:19.702 ***** 12755 1727204094.46640: entering _queue_task() for managed-node1/copy 12755 1727204094.47010: worker is 1 (out of 1 available) 12755 1727204094.47026: exiting _queue_task() for managed-node1/copy 12755 1727204094.47155: done queuing things up, now waiting for results queue to drain 12755 1727204094.47157: waiting for pending results... 
12755 1727204094.47609: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12755 1727204094.47615: in run() - task 12b410aa-8751-72e9-1a19-000000000035 12755 1727204094.47621: variable 'ansible_search_path' from source: unknown 12755 1727204094.47625: variable 'ansible_search_path' from source: unknown 12755 1727204094.47628: calling self._execute() 12755 1727204094.47746: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204094.47755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204094.47768: variable 'omit' from source: magic vars 12755 1727204094.48254: variable 'ansible_distribution_major_version' from source: facts 12755 1727204094.48269: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204094.48426: variable 'network_provider' from source: set_fact 12755 1727204094.48433: Evaluated conditional (network_provider == "initscripts"): False 12755 1727204094.48437: when evaluation is False, skipping this task 12755 1727204094.48440: _execute() done 12755 1727204094.48443: dumping result to json 12755 1727204094.48449: done dumping result, returning 12755 1727204094.48459: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-72e9-1a19-000000000035] 12755 1727204094.48475: sending task result for task 12b410aa-8751-72e9-1a19-000000000035 12755 1727204094.48688: done sending task result for task 12b410aa-8751-72e9-1a19-000000000035 12755 1727204094.48694: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12755 1727204094.48749: no more pending results, returning what we have 12755 1727204094.48753: results queue empty 12755 1727204094.48755: checking for 
any_errors_fatal 12755 1727204094.48760: done checking for any_errors_fatal 12755 1727204094.48762: checking for max_fail_percentage 12755 1727204094.48763: done checking for max_fail_percentage 12755 1727204094.48764: checking to see if all hosts have failed and the running result is not ok 12755 1727204094.48766: done checking to see if all hosts have failed 12755 1727204094.48767: getting the remaining hosts for this loop 12755 1727204094.48768: done getting the remaining hosts for this loop 12755 1727204094.48773: getting the next task for host managed-node1 12755 1727204094.48780: done getting next task for host managed-node1 12755 1727204094.48785: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12755 1727204094.48894: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204094.48912: getting variables 12755 1727204094.48914: in VariableManager get_vars() 12755 1727204094.48972: Calling all_inventory to load vars for managed-node1 12755 1727204094.48976: Calling groups_inventory to load vars for managed-node1 12755 1727204094.48979: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204094.48994: Calling all_plugins_play to load vars for managed-node1 12755 1727204094.48998: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204094.49007: Calling groups_plugins_play to load vars for managed-node1 12755 1727204094.51230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204094.54101: done with get_vars() 12755 1727204094.54149: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:54 -0400 (0:00:00.076) 0:00:19.778 ***** 12755 1727204094.54253: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 12755 1727204094.54255: Creating lock for fedora.linux_system_roles.network_connections 12755 1727204094.54611: worker is 1 (out of 1 available) 12755 1727204094.54629: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 12755 1727204094.54643: done queuing things up, now waiting for results queue to drain 12755 1727204094.54644: waiting for pending results... 
12755 1727204094.54949: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12755 1727204094.55099: in run() - task 12b410aa-8751-72e9-1a19-000000000036 12755 1727204094.55196: variable 'ansible_search_path' from source: unknown 12755 1727204094.55201: variable 'ansible_search_path' from source: unknown 12755 1727204094.55204: calling self._execute() 12755 1727204094.55267: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204094.55276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204094.55286: variable 'omit' from source: magic vars 12755 1727204094.55772: variable 'ansible_distribution_major_version' from source: facts 12755 1727204094.55793: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204094.55801: variable 'omit' from source: magic vars 12755 1727204094.55877: variable 'omit' from source: magic vars 12755 1727204094.56103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204094.58826: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204094.58830: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204094.58859: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204094.58904: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204094.58941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204094.59042: variable 'network_provider' from source: set_fact 12755 1727204094.59214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204094.59268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204094.59310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204094.59367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204094.59392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204094.59480: variable 'omit' from source: magic vars 12755 1727204094.59630: variable 'omit' from source: magic vars 12755 1727204094.59808: variable 'network_connections' from source: task vars 12755 1727204094.59811: variable 'controller_profile' from source: play vars 12755 1727204094.59858: variable 'controller_profile' from source: play vars 12755 1727204094.59867: variable 'controller_device' from source: play vars 12755 1727204094.59990: variable 'controller_device' from source: play vars 12755 1727204094.59994: variable 'port1_profile' from source: play vars 12755 1727204094.60051: variable 'port1_profile' from source: play vars 12755 1727204094.60060: variable 'dhcp_interface1' from source: play vars 12755 1727204094.60133: variable 'dhcp_interface1' from source: play vars 12755 1727204094.60141: variable 'controller_profile' from source: play vars 12755 1727204094.60227: variable 'controller_profile' from source: play vars 12755 1727204094.60236: 
variable 'port2_profile' from source: play vars 12755 1727204094.60314: variable 'port2_profile' from source: play vars 12755 1727204094.60332: variable 'dhcp_interface2' from source: play vars 12755 1727204094.60587: variable 'dhcp_interface2' from source: play vars 12755 1727204094.60592: variable 'controller_profile' from source: play vars 12755 1727204094.60594: variable 'controller_profile' from source: play vars 12755 1727204094.60724: variable 'omit' from source: magic vars 12755 1727204094.60733: variable '__lsr_ansible_managed' from source: task vars 12755 1727204094.60995: variable '__lsr_ansible_managed' from source: task vars 12755 1727204094.61036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12755 1727204094.61331: Loaded config def from plugin (lookup/template) 12755 1727204094.61336: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12755 1727204094.61378: File lookup term: get_ansible_managed.j2 12755 1727204094.61381: variable 'ansible_search_path' from source: unknown 12755 1727204094.61391: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12755 1727204094.61407: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12755 1727204094.61424: variable 'ansible_search_path' from source: unknown 12755 1727204094.72331: variable 'ansible_managed' from source: unknown 12755 1727204094.72770: variable 'omit' from source: magic vars 12755 1727204094.72916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204094.72992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204094.73024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204094.73053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204094.73072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204094.73114: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204094.73129: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204094.73140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204094.73322: Set connection var ansible_connection to ssh 12755 1727204094.73338: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204094.73347: Set connection var ansible_shell_type to sh 12755 1727204094.73395: Set connection var ansible_timeout to 10 12755 1727204094.73399: Set connection var ansible_shell_executable to /bin/sh 12755 1727204094.73443: Set connection var ansible_pipelining to False 12755 1727204094.73496: 
variable 'ansible_shell_executable' from source: unknown 12755 1727204094.73499: variable 'ansible_connection' from source: unknown 12755 1727204094.73502: variable 'ansible_module_compression' from source: unknown 12755 1727204094.73504: variable 'ansible_shell_type' from source: unknown 12755 1727204094.73506: variable 'ansible_shell_executable' from source: unknown 12755 1727204094.73508: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204094.73510: variable 'ansible_pipelining' from source: unknown 12755 1727204094.73519: variable 'ansible_timeout' from source: unknown 12755 1727204094.73595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204094.73866: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204094.73887: variable 'omit' from source: magic vars 12755 1727204094.73906: starting attempt loop 12755 1727204094.73913: running the handler 12755 1727204094.73933: _low_level_execute_command(): starting 12755 1727204094.73944: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204094.75291: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204094.75296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204094.75298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204094.75395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204094.75474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204094.75631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204094.75709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204094.77484: stdout chunk (state=3): >>>/root <<< 12755 1727204094.77794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204094.77800: stdout chunk (state=3): >>><<< 12755 1727204094.77802: stderr chunk (state=3): >>><<< 12755 1727204094.77805: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204094.77807: _low_level_execute_command(): starting 12755 1727204094.77810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389 `" && echo ansible-tmp-1727204094.777049-14057-147036943038389="` echo /root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389 `" ) && sleep 0' 12755 1727204094.78368: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204094.78384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204094.78408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204094.78434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204094.78454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204094.78467: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204094.78483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204094.78507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204094.78601: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204094.78628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204094.78907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204094.80852: stdout chunk (state=3): >>>ansible-tmp-1727204094.777049-14057-147036943038389=/root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389 <<< 12755 1727204094.81202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204094.81206: stdout chunk (state=3): >>><<< 12755 1727204094.81208: stderr chunk (state=3): >>><<< 12755 1727204094.81211: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204094.777049-14057-147036943038389=/root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204094.81220: variable 'ansible_module_compression' from source: unknown 12755 1727204094.81224: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 12755 1727204094.81227: ANSIBALLZ: Acquiring lock 12755 1727204094.81230: ANSIBALLZ: Lock acquired: 139630694946560 12755 1727204094.81594: ANSIBALLZ: Creating module 12755 1727204095.28235: ANSIBALLZ: Writing module into payload 12755 1727204095.28743: ANSIBALLZ: Writing module 12755 1727204095.28777: ANSIBALLZ: Renaming module 12755 1727204095.28794: ANSIBALLZ: Done creating module 12755 1727204095.28834: variable 'ansible_facts' from source: unknown 12755 1727204095.28958: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/AnsiballZ_network_connections.py 12755 1727204095.29114: Sending initial data 12755 1727204095.29152: Sent initial data (167 bytes) 12755 1727204095.29904: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204095.29949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204095.29970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204095.29990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204095.30083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204095.31833: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204095.31862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204095.31945: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpke3yszmm /root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/AnsiballZ_network_connections.py <<< 12755 1727204095.31949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/AnsiballZ_network_connections.py" <<< 12755 1727204095.31998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpke3yszmm" to remote "/root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/AnsiballZ_network_connections.py" <<< 12755 1727204095.33639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204095.33679: stderr chunk (state=3): >>><<< 12755 1727204095.33692: stdout chunk (state=3): >>><<< 12755 1727204095.33782: done transferring module to remote 12755 1727204095.33786: _low_level_execute_command(): starting 12755 1727204095.33791: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/ /root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/AnsiballZ_network_connections.py && sleep 0' 12755 1727204095.34407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204095.34418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204095.34506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204095.34556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204095.34560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204095.34583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204095.34640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204095.36796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204095.36799: stdout chunk (state=3): >>><<< 12755 1727204095.36804: stderr chunk (state=3): >>><<< 12755 1727204095.36807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204095.36809: _low_level_execute_command(): starting 12755 1727204095.36812: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/AnsiballZ_network_connections.py && sleep 0' 12755 1727204095.37343: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204095.37353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204095.37364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204095.37381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204095.37397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204095.37406: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204095.37420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204095.37434: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204095.37475: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204095.37478: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12755 1727204095.37481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204095.37484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 
1727204095.37486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204095.37490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204095.37497: stderr chunk (state=3): >>>debug2: match found <<< 12755 1727204095.37508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204095.37628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204095.37632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204095.37635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204095.37768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204095.84468: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", 
"interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 12755 1727204095.84719: stdout chunk (state=3): >>> <<< 12755 1727204095.87088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204095.87125: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204095.87129: stdout chunk (state=3): >>><<< 12755 1727204095.87131: stderr chunk (state=3): >>><<< 12755 1727204095.87162: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204095.87271: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204095.87391: _low_level_execute_command(): starting 12755 1727204095.87396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204094.777049-14057-147036943038389/ > /dev/null 2>&1 && sleep 0' 12755 1727204095.88033: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204095.88059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204095.88076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204095.88175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204095.88221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204095.88240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204095.88269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204095.88347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204095.90584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204095.90795: stdout chunk (state=3): >>><<< 12755 1727204095.90798: stderr chunk (state=3): >>><<< 12755 1727204095.90803: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204095.90806: handler run complete 12755 1727204095.90808: attempt loop complete, returning result 12755 1727204095.90810: _execute() done 12755 1727204095.90813: dumping result to json 12755 1727204095.90815: done dumping result, returning 12755 1727204095.90829: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-72e9-1a19-000000000036] 12755 1727204095.90835: sending task result for task 12b410aa-8751-72e9-1a19-000000000036 12755 1727204095.90998: done sending task result for task 12b410aa-8751-72e9-1a19-000000000036 12755 1727204095.91001: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776 [008] #1, state:up persistent_state:present, 'bond0.0': add 
connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active) 12755 1727204095.91181: no more pending results, returning what we have 12755 1727204095.91185: results queue empty 12755 1727204095.91186: checking for any_errors_fatal 12755 1727204095.91388: done checking for any_errors_fatal 12755 1727204095.91392: checking for max_fail_percentage 12755 1727204095.91394: done checking for max_fail_percentage 12755 1727204095.91395: checking to see if all hosts have failed and the running result is not ok 12755 1727204095.91396: done checking to see if all hosts have failed 12755 1727204095.91397: getting the remaining hosts for this loop 12755 1727204095.91399: done getting the remaining hosts for this loop 12755 1727204095.91403: getting the next task for host managed-node1 12755 1727204095.91410: done getting next task for host managed-node1 12755 1727204095.91414: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12755 1727204095.91417: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204095.91430: getting variables 12755 1727204095.91432: in VariableManager get_vars() 12755 1727204095.91494: Calling all_inventory to load vars for managed-node1 12755 1727204095.91497: Calling groups_inventory to load vars for managed-node1 12755 1727204095.91501: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204095.91513: Calling all_plugins_play to load vars for managed-node1 12755 1727204095.91516: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204095.91520: Calling groups_plugins_play to load vars for managed-node1 12755 1727204095.93745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204095.96772: done with get_vars() 12755 1727204095.96818: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:55 -0400 (0:00:01.426) 0:00:21.205 ***** 12755 1727204095.96931: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204095.96933: Creating lock for fedora.linux_system_roles.network_state 12755 1727204095.97305: worker is 1 (out of 1 available) 12755 1727204095.97319: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204095.97445: done queuing things up, now waiting for results queue to drain 12755 1727204095.97447: waiting for pending results... 
12755 1727204095.97807: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 12755 1727204095.97812: in run() - task 12b410aa-8751-72e9-1a19-000000000037 12755 1727204095.97830: variable 'ansible_search_path' from source: unknown 12755 1727204095.97835: variable 'ansible_search_path' from source: unknown 12755 1727204095.97873: calling self._execute() 12755 1727204095.97985: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204095.98001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204095.98015: variable 'omit' from source: magic vars 12755 1727204095.98569: variable 'ansible_distribution_major_version' from source: facts 12755 1727204095.98574: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204095.98797: variable 'network_state' from source: role '' defaults 12755 1727204095.98800: Evaluated conditional (network_state != {}): False 12755 1727204095.98803: when evaluation is False, skipping this task 12755 1727204095.98805: _execute() done 12755 1727204095.98808: dumping result to json 12755 1727204095.98809: done dumping result, returning 12755 1727204095.98819: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-72e9-1a19-000000000037] 12755 1727204095.98822: sending task result for task 12b410aa-8751-72e9-1a19-000000000037 12755 1727204095.98888: done sending task result for task 12b410aa-8751-72e9-1a19-000000000037 12755 1727204095.98893: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204095.98973: no more pending results, returning what we have 12755 1727204095.98977: results queue empty 12755 1727204095.98979: checking for any_errors_fatal 12755 1727204095.98994: done checking for any_errors_fatal 
12755 1727204095.98995: checking for max_fail_percentage 12755 1727204095.98997: done checking for max_fail_percentage 12755 1727204095.98998: checking to see if all hosts have failed and the running result is not ok 12755 1727204095.98999: done checking to see if all hosts have failed 12755 1727204095.99000: getting the remaining hosts for this loop 12755 1727204095.99001: done getting the remaining hosts for this loop 12755 1727204095.99006: getting the next task for host managed-node1 12755 1727204095.99012: done getting next task for host managed-node1 12755 1727204095.99017: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204095.99020: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204095.99100: getting variables 12755 1727204095.99103: in VariableManager get_vars() 12755 1727204095.99163: Calling all_inventory to load vars for managed-node1 12755 1727204095.99167: Calling groups_inventory to load vars for managed-node1 12755 1727204095.99171: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204095.99182: Calling all_plugins_play to load vars for managed-node1 12755 1727204095.99186: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204095.99192: Calling groups_plugins_play to load vars for managed-node1 12755 1727204096.01566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204096.04559: done with get_vars() 12755 1727204096.04611: done getting variables 12755 1727204096.04681: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:56 -0400 (0:00:00.077) 0:00:21.282 ***** 12755 1727204096.04729: entering _queue_task() for managed-node1/debug 12755 1727204096.05299: worker is 1 (out of 1 available) 12755 1727204096.05315: exiting _queue_task() for managed-node1/debug 12755 1727204096.05327: done queuing things up, now waiting for results queue to drain 12755 1727204096.05328: waiting for pending results... 
12755 1727204096.05479: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204096.05698: in run() - task 12b410aa-8751-72e9-1a19-000000000038 12755 1727204096.05704: variable 'ansible_search_path' from source: unknown 12755 1727204096.05707: variable 'ansible_search_path' from source: unknown 12755 1727204096.05723: calling self._execute() 12755 1727204096.05906: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.05910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204096.05913: variable 'omit' from source: magic vars 12755 1727204096.06342: variable 'ansible_distribution_major_version' from source: facts 12755 1727204096.06356: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204096.06364: variable 'omit' from source: magic vars 12755 1727204096.06448: variable 'omit' from source: magic vars 12755 1727204096.06493: variable 'omit' from source: magic vars 12755 1727204096.06537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204096.06588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204096.06613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204096.06636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204096.06655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204096.06698: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204096.06702: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.06706: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12755 1727204096.06842: Set connection var ansible_connection to ssh 12755 1727204096.06889: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204096.06892: Set connection var ansible_shell_type to sh 12755 1727204096.06896: Set connection var ansible_timeout to 10 12755 1727204096.06899: Set connection var ansible_shell_executable to /bin/sh 12755 1727204096.06901: Set connection var ansible_pipelining to False 12755 1727204096.06922: variable 'ansible_shell_executable' from source: unknown 12755 1727204096.06925: variable 'ansible_connection' from source: unknown 12755 1727204096.06929: variable 'ansible_module_compression' from source: unknown 12755 1727204096.06931: variable 'ansible_shell_type' from source: unknown 12755 1727204096.06933: variable 'ansible_shell_executable' from source: unknown 12755 1727204096.06994: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.06998: variable 'ansible_pipelining' from source: unknown 12755 1727204096.07001: variable 'ansible_timeout' from source: unknown 12755 1727204096.07003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204096.07139: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204096.07152: variable 'omit' from source: magic vars 12755 1727204096.07158: starting attempt loop 12755 1727204096.07162: running the handler 12755 1727204096.07337: variable '__network_connections_result' from source: set_fact 12755 1727204096.07445: handler run complete 12755 1727204096.07449: attempt loop complete, returning result 12755 1727204096.07452: _execute() done 12755 1727204096.07455: dumping result to json 12755 1727204096.07458: 
done dumping result, returning
12755 1727204096.07460: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-72e9-1a19-000000000038]
12755 1727204096.07463: sending task result for task 12b410aa-8751-72e9-1a19-000000000038
12755 1727204096.07568: done sending task result for task 12b410aa-8751-72e9-1a19-000000000038
12755 1727204096.07572: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result.stderr_lines": [
        "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776",
        "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c",
        "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38",
        "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776 (is-modified)",
        "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)",
        "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)"
    ]
}
12755 1727204096.07807: no more pending results, returning what we have
12755 1727204096.07811: results queue empty
12755 1727204096.07812: checking for any_errors_fatal
12755 1727204096.07819: done checking for any_errors_fatal
12755 1727204096.07820: checking for max_fail_percentage
12755 1727204096.07821: done checking for max_fail_percentage
12755 1727204096.07822: checking to see if all hosts have failed and the running result is not ok
12755 1727204096.07823: done checking to see if all hosts have failed
12755 1727204096.07825: getting the remaining hosts for this loop
12755 1727204096.07826: done getting the remaining hosts for this loop
12755 1727204096.07830: getting the next task for host
managed-node1 12755 1727204096.07837: done getting next task for host managed-node1 12755 1727204096.07841: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204096.07845: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204096.07858: getting variables 12755 1727204096.07860: in VariableManager get_vars() 12755 1727204096.07922: Calling all_inventory to load vars for managed-node1 12755 1727204096.07926: Calling groups_inventory to load vars for managed-node1 12755 1727204096.07929: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204096.07940: Calling all_plugins_play to load vars for managed-node1 12755 1727204096.07943: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204096.07947: Calling groups_plugins_play to load vars for managed-node1 12755 1727204096.10346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204096.13300: done with get_vars() 12755 1727204096.13345: done getting variables 12755 1727204096.13417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:56 -0400 (0:00:00.087) 0:00:21.370 ***** 12755 1727204096.13471: entering _queue_task() for managed-node1/debug 12755 1727204096.13823: worker is 1 (out of 1 available) 12755 1727204096.13839: exiting _queue_task() for managed-node1/debug 12755 1727204096.13852: done queuing things up, now waiting for results queue to drain 12755 1727204096.13854: waiting for pending results... 12755 1727204096.14583: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204096.14592: in run() - task 12b410aa-8751-72e9-1a19-000000000039 12755 1727204096.14677: variable 'ansible_search_path' from source: unknown 12755 1727204096.14681: variable 'ansible_search_path' from source: unknown 12755 1727204096.14721: calling self._execute() 12755 1727204096.14938: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.14946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204096.14958: variable 'omit' from source: magic vars 12755 1727204096.15923: variable 'ansible_distribution_major_version' from source: facts 12755 1727204096.15935: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204096.15942: variable 'omit' from source: magic vars 12755 1727204096.16138: variable 'omit' from source: magic vars 12755 1727204096.16319: variable 'omit' from source: magic vars 12755 1727204096.16397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204096.16402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204096.16535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
12755 1727204096.16555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204096.16570: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204096.16609: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204096.16613: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.16703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204096.17030: Set connection var ansible_connection to ssh 12755 1727204096.17033: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204096.17036: Set connection var ansible_shell_type to sh 12755 1727204096.17038: Set connection var ansible_timeout to 10 12755 1727204096.17040: Set connection var ansible_shell_executable to /bin/sh 12755 1727204096.17042: Set connection var ansible_pipelining to False 12755 1727204096.17044: variable 'ansible_shell_executable' from source: unknown 12755 1727204096.17095: variable 'ansible_connection' from source: unknown 12755 1727204096.17099: variable 'ansible_module_compression' from source: unknown 12755 1727204096.17303: variable 'ansible_shell_type' from source: unknown 12755 1727204096.17307: variable 'ansible_shell_executable' from source: unknown 12755 1727204096.17310: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.17312: variable 'ansible_pipelining' from source: unknown 12755 1727204096.17315: variable 'ansible_timeout' from source: unknown 12755 1727204096.17320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204096.17470: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12755 1727204096.17483: variable 'omit' from source: magic vars
12755 1727204096.17495: starting attempt loop
12755 1727204096.17499: running the handler
12755 1727204096.17555: variable '__network_connections_result' from source: set_fact
12755 1727204096.17770: variable '__network_connections_result' from source: set_fact
12755 1727204096.17901: handler run complete
12755 1727204096.17949: attempt loop complete, returning result
12755 1727204096.17953: _execute() done
12755 1727204096.17956: dumping result to json
12755 1727204096.17964: done dumping result, returning
12755 1727204096.17983: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-72e9-1a19-000000000039]
12755 1727204096.17986: sending task result for task 12b410aa-8751-72e9-1a19-000000000039
12755 1727204096.18297: done sending task result for task 12b410aa-8751-72e9-1a19-000000000039
12755 1727204096.18301: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776",
            "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c",
            "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, d1ad3f47-f869-4876-b6a6-dbe0ff47e776 (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)"
        ]
    }
}
12755 1727204096.18432: no more pending results, returning what we have
12755 1727204096.18436: results queue empty
12755 1727204096.18442: checking for any_errors_fatal
12755 1727204096.18448: done checking for any_errors_fatal
12755 1727204096.18449: checking for max_fail_percentage
12755 1727204096.18451: done checking for max_fail_percentage
12755 1727204096.18452: checking to see if all hosts have failed and the running result is not ok
12755 1727204096.18453: done checking to see if all hosts have failed
12755 1727204096.18454: getting the remaining
hosts for this loop 12755 1727204096.18456: done getting the remaining hosts for this loop 12755 1727204096.18460: getting the next task for host managed-node1 12755 1727204096.18467: done getting next task for host managed-node1 12755 1727204096.18471: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12755 1727204096.18527: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204096.18541: getting variables 12755 1727204096.18543: in VariableManager get_vars() 12755 1727204096.18603: Calling all_inventory to load vars for managed-node1 12755 1727204096.18606: Calling groups_inventory to load vars for managed-node1 12755 1727204096.18608: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204096.18618: Calling all_plugins_play to load vars for managed-node1 12755 1727204096.18621: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204096.18624: Calling groups_plugins_play to load vars for managed-node1 12755 1727204096.21542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204096.25573: done with get_vars() 12755 1727204096.25623: done getting variables 12755 1727204096.25702: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:56 -0400 (0:00:00.122) 0:00:21.493 ***** 12755 1727204096.25748: entering _queue_task() for managed-node1/debug 12755 1727204096.26135: worker is 1 (out of 1 available) 12755 1727204096.26151: exiting _queue_task() for managed-node1/debug 12755 1727204096.26166: done queuing things up, now waiting for results queue to drain 12755 1727204096.26167: waiting for pending results... 12755 1727204096.26466: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12755 1727204096.26639: in run() - task 12b410aa-8751-72e9-1a19-00000000003a 12755 1727204096.26658: variable 'ansible_search_path' from source: unknown 12755 1727204096.26662: variable 'ansible_search_path' from source: unknown 12755 1727204096.26703: calling self._execute() 12755 1727204096.26896: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.26901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204096.26904: variable 'omit' from source: magic vars 12755 1727204096.27290: variable 'ansible_distribution_major_version' from source: facts 12755 1727204096.27305: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204096.27448: variable 'network_state' from source: role '' defaults 12755 1727204096.27459: Evaluated conditional (network_state != {}): False 12755 1727204096.27463: when evaluation is False, skipping this task 12755 1727204096.27466: _execute() done 12755 1727204096.27471: dumping result to json 12755 1727204096.27475: done 
dumping result, returning
12755 1727204096.27484: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-72e9-1a19-00000000003a]
12755 1727204096.27500: sending task result for task 12b410aa-8751-72e9-1a19-00000000003a
12755 1727204096.27765: done sending task result for task 12b410aa-8751-72e9-1a19-00000000003a
12755 1727204096.27769: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
12755 1727204096.27824: no more pending results, returning what we have
12755 1727204096.27829: results queue empty
12755 1727204096.27830: checking for any_errors_fatal
12755 1727204096.27840: done checking for any_errors_fatal
12755 1727204096.27841: checking for max_fail_percentage
12755 1727204096.27843: done checking for max_fail_percentage
12755 1727204096.27844: checking to see if all hosts have failed and the running result is not ok
12755 1727204096.27845: done checking to see if all hosts have failed
12755 1727204096.27846: getting the remaining hosts for this loop
12755 1727204096.27848: done getting the remaining hosts for this loop
12755 1727204096.27852: getting the next task for host managed-node1
12755 1727204096.27859: done getting next task for host managed-node1
12755 1727204096.27864: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
12755 1727204096.27867: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False 12755 1727204096.27884: getting variables 12755 1727204096.27886: in VariableManager get_vars() 12755 1727204096.28035: Calling all_inventory to load vars for managed-node1 12755 1727204096.28039: Calling groups_inventory to load vars for managed-node1 12755 1727204096.28042: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204096.28054: Calling all_plugins_play to load vars for managed-node1 12755 1727204096.28057: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204096.28061: Calling groups_plugins_play to load vars for managed-node1 12755 1727204096.30399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204096.33376: done with get_vars() 12755 1727204096.33416: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:56 -0400 (0:00:00.077) 0:00:21.571 ***** 12755 1727204096.33537: entering _queue_task() for managed-node1/ping 12755 1727204096.33539: Creating lock for ping 12755 1727204096.34106: worker is 1 (out of 1 available) 12755 1727204096.34120: exiting _queue_task() for managed-node1/ping 12755 1727204096.34133: done queuing things up, now waiting for results queue to drain 12755 1727204096.34135: waiting for pending results... 
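For readability: the connection profiles echoed in the module_args of the debug result earlier in this log correspond to a `network_connections` variable of roughly the following shape. This is a sketch reconstructed from the logged values (an active-backup bond `bond0` with ports `test1`/`test2`), not the actual playbook source:

```yaml
# Reconstructed from the module_args in the debug output above; a sketch,
# not the playbook that produced this run.
network_connections:
  - name: bond0
    type: bond
    interface_name: nm-bond
    state: up
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    type: ethernet
    interface_name: test1
    controller: bond0
    state: up
  - name: bond0.1
    type: ethernet
    interface_name: test2
    controller: bond0
    state: up
```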
12755 1727204096.34280: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12755 1727204096.34446: in run() - task 12b410aa-8751-72e9-1a19-00000000003b 12755 1727204096.34463: variable 'ansible_search_path' from source: unknown 12755 1727204096.34466: variable 'ansible_search_path' from source: unknown 12755 1727204096.34522: calling self._execute() 12755 1727204096.34632: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.34641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204096.34656: variable 'omit' from source: magic vars 12755 1727204096.35168: variable 'ansible_distribution_major_version' from source: facts 12755 1727204096.35171: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204096.35174: variable 'omit' from source: magic vars 12755 1727204096.35230: variable 'omit' from source: magic vars 12755 1727204096.35283: variable 'omit' from source: magic vars 12755 1727204096.35329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204096.35386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204096.35492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204096.35496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204096.35500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204096.35502: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204096.35505: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.35507: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12755 1727204096.35626: Set connection var ansible_connection to ssh 12755 1727204096.35629: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204096.35632: Set connection var ansible_shell_type to sh 12755 1727204096.35649: Set connection var ansible_timeout to 10 12755 1727204096.35656: Set connection var ansible_shell_executable to /bin/sh 12755 1727204096.35664: Set connection var ansible_pipelining to False 12755 1727204096.35701: variable 'ansible_shell_executable' from source: unknown 12755 1727204096.35704: variable 'ansible_connection' from source: unknown 12755 1727204096.35707: variable 'ansible_module_compression' from source: unknown 12755 1727204096.35710: variable 'ansible_shell_type' from source: unknown 12755 1727204096.35713: variable 'ansible_shell_executable' from source: unknown 12755 1727204096.35733: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204096.35736: variable 'ansible_pipelining' from source: unknown 12755 1727204096.35739: variable 'ansible_timeout' from source: unknown 12755 1727204096.35742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204096.36062: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204096.36067: variable 'omit' from source: magic vars 12755 1727204096.36070: starting attempt loop 12755 1727204096.36072: running the handler 12755 1727204096.36074: _low_level_execute_command(): starting 12755 1727204096.36077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204096.36894: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204096.36932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204096.36948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204096.36954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204096.37033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204096.38809: stdout chunk (state=3): >>>/root <<< 12755 1727204096.38929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204096.38999: stderr chunk (state=3): >>><<< 12755 1727204096.39027: stdout chunk (state=3): >>><<< 12755 1727204096.39049: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204096.39137: _low_level_execute_command(): starting 12755 1727204096.39141: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167 `" && echo ansible-tmp-1727204096.39056-14199-223715206831167="` echo /root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167 `" ) && sleep 0' 12755 1727204096.39729: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204096.39744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204096.39809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204096.39895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204096.39923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204096.40018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204096.42113: stdout chunk (state=3): >>>ansible-tmp-1727204096.39056-14199-223715206831167=/root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167 <<< 12755 1727204096.42327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204096.42331: stdout chunk (state=3): >>><<< 12755 1727204096.42333: stderr chunk (state=3): >>><<< 12755 1727204096.42496: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204096.39056-14199-223715206831167=/root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204096.42500: variable 'ansible_module_compression' from source: unknown 12755 1727204096.42502: ANSIBALLZ: Using lock for ping 12755 1727204096.42504: ANSIBALLZ: Acquiring lock 12755 1727204096.42507: ANSIBALLZ: Lock acquired: 139630693869632 12755 1727204096.42509: ANSIBALLZ: Creating module 12755 1727204096.59471: ANSIBALLZ: Writing module into payload 12755 1727204096.59700: ANSIBALLZ: Writing module 12755 1727204096.59703: ANSIBALLZ: Renaming module 12755 1727204096.59706: ANSIBALLZ: Done creating module 12755 1727204096.59708: variable 'ansible_facts' from source: unknown 12755 1727204096.59710: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/AnsiballZ_ping.py 12755 1727204096.59946: Sending initial data 12755 1727204096.59951: Sent initial data (151 bytes) 12755 1727204096.60491: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204096.60595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204096.60603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204096.60633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204096.60708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204096.62459: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204096.62524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204096.62575: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmppyr23f6r /root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/AnsiballZ_ping.py <<< 12755 1727204096.62593: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/AnsiballZ_ping.py" <<< 12755 1727204096.62637: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmppyr23f6r" to remote "/root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/AnsiballZ_ping.py" <<< 12755 1727204096.63796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204096.63829: stderr chunk (state=3): >>><<< 12755 1727204096.63841: stdout chunk (state=3): >>><<< 12755 1727204096.63872: done transferring module to remote 12755 1727204096.63906: _low_level_execute_command(): starting 12755 1727204096.63919: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/ /root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/AnsiballZ_ping.py && sleep 0' 12755 1727204096.64674: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204096.64693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204096.64710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204096.64780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204096.64841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204096.64860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204096.64908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204096.64956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204096.67185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204096.67212: stderr chunk (state=3): >>><<< 12755 1727204096.67215: stdout chunk (state=3): >>><<< 12755 1727204096.67435: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204096.67438: _low_level_execute_command(): starting 12755 1727204096.67441: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/AnsiballZ_ping.py && sleep 0' 12755 1727204096.68254: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204096.68273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204096.68323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204096.68340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204096.68437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204096.68468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204096.68487: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204096.68569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204096.86641: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12755 1727204096.88398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204096.88403: stdout chunk (state=3): >>><<< 12755 1727204096.88405: stderr chunk (state=3): >>><<< 12755 1727204096.88408: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204096.88411: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204096.88413: _low_level_execute_command(): starting 12755 1727204096.88416: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204096.39056-14199-223715206831167/ > /dev/null 2>&1 && sleep 0' 12755 1727204096.89747: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204096.89751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204096.89907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204096.90084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204096.90093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204096.90168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204096.92203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204096.92371: stderr chunk (state=3): >>><<< 12755 1727204096.92382: stdout chunk (state=3): >>><<< 12755 1727204096.92410: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204096.92579: handler run complete 12755 1727204096.92583: attempt loop complete, returning result 12755 1727204096.92586: _execute() done 12755 
1727204096.92588: dumping result to json 12755 1727204096.92594: done dumping result, returning 12755 1727204096.92596: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-72e9-1a19-00000000003b] 12755 1727204096.92598: sending task result for task 12b410aa-8751-72e9-1a19-00000000003b 12755 1727204096.92674: done sending task result for task 12b410aa-8751-72e9-1a19-00000000003b ok: [managed-node1] => { "changed": false, "ping": "pong" } 12755 1727204096.92760: no more pending results, returning what we have 12755 1727204096.92765: results queue empty 12755 1727204096.92767: checking for any_errors_fatal 12755 1727204096.92775: done checking for any_errors_fatal 12755 1727204096.92776: checking for max_fail_percentage 12755 1727204096.92778: done checking for max_fail_percentage 12755 1727204096.92779: checking to see if all hosts have failed and the running result is not ok 12755 1727204096.92780: done checking to see if all hosts have failed 12755 1727204096.92781: getting the remaining hosts for this loop 12755 1727204096.92782: done getting the remaining hosts for this loop 12755 1727204096.92788: getting the next task for host managed-node1 12755 1727204096.93104: done getting next task for host managed-node1 12755 1727204096.93107: ^ task is: TASK: meta (role_complete) 12755 1727204096.93111: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204096.93128: getting variables 12755 1727204096.93130: in VariableManager get_vars() 12755 1727204096.93203: Calling all_inventory to load vars for managed-node1 12755 1727204096.93207: Calling groups_inventory to load vars for managed-node1 12755 1727204096.93211: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204096.93228: Calling all_plugins_play to load vars for managed-node1 12755 1727204096.93234: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204096.93239: Calling groups_plugins_play to load vars for managed-node1 12755 1727204096.94138: WORKER PROCESS EXITING 12755 1727204096.96491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204097.00583: done with get_vars() 12755 1727204097.00645: done getting variables 12755 1727204097.00757: done queuing things up, now waiting for results queue to drain 12755 1727204097.00760: results queue empty 12755 1727204097.00761: checking for any_errors_fatal 12755 1727204097.00765: done checking for any_errors_fatal 12755 1727204097.00766: checking for max_fail_percentage 12755 1727204097.00767: done checking for max_fail_percentage 12755 1727204097.00768: checking to see if all hosts have failed and the running result is not ok 12755 1727204097.00769: done checking to see if all hosts have failed 12755 1727204097.00770: getting the remaining hosts for this loop 12755 1727204097.00771: done getting the remaining hosts for this loop 12755 1727204097.00774: getting the next task for host managed-node1 12755 1727204097.00780: done getting next task for host managed-node1 12755 1727204097.00782: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12755 1727204097.00784: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204097.00787: getting variables 12755 1727204097.00788: in VariableManager get_vars() 12755 1727204097.00816: Calling all_inventory to load vars for managed-node1 12755 1727204097.00821: Calling groups_inventory to load vars for managed-node1 12755 1727204097.00824: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204097.00834: Calling all_plugins_play to load vars for managed-node1 12755 1727204097.00837: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204097.00840: Calling groups_plugins_play to load vars for managed-node1 12755 1727204097.04527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204097.09039: done with get_vars() 12755 1727204097.09098: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.756) 0:00:22.328 ***** 12755 1727204097.09231: entering _queue_task() for managed-node1/include_tasks 12755 1727204097.09715: worker is 1 (out of 1 available) 12755 1727204097.09735: exiting _queue_task() for managed-node1/include_tasks 12755 1727204097.09749: done queuing things up, now waiting for results queue to drain 12755 1727204097.09750: waiting for pending results... 
12755 1727204097.09980: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 12755 1727204097.10198: in run() - task 12b410aa-8751-72e9-1a19-00000000006e 12755 1727204097.10214: variable 'ansible_search_path' from source: unknown 12755 1727204097.10218: variable 'ansible_search_path' from source: unknown 12755 1727204097.10262: calling self._execute() 12755 1727204097.10445: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204097.10454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.10470: variable 'omit' from source: magic vars 12755 1727204097.11009: variable 'ansible_distribution_major_version' from source: facts 12755 1727204097.11025: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204097.11034: _execute() done 12755 1727204097.11037: dumping result to json 12755 1727204097.11048: done dumping result, returning 12755 1727204097.11058: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-72e9-1a19-00000000006e] 12755 1727204097.11063: sending task result for task 12b410aa-8751-72e9-1a19-00000000006e 12755 1727204097.11282: done sending task result for task 12b410aa-8751-72e9-1a19-00000000006e 12755 1727204097.11286: WORKER PROCESS EXITING 12755 1727204097.11315: no more pending results, returning what we have 12755 1727204097.11322: in VariableManager get_vars() 12755 1727204097.11378: Calling all_inventory to load vars for managed-node1 12755 1727204097.11382: Calling groups_inventory to load vars for managed-node1 12755 1727204097.11385: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204097.11401: Calling all_plugins_play to load vars for managed-node1 12755 1727204097.11405: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204097.11409: Calling groups_plugins_play to load vars for managed-node1 12755 
1727204097.19499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204097.22514: done with get_vars() 12755 1727204097.22563: variable 'ansible_search_path' from source: unknown 12755 1727204097.22565: variable 'ansible_search_path' from source: unknown 12755 1727204097.22612: we have included files to process 12755 1727204097.22614: generating all_blocks data 12755 1727204097.22620: done generating all_blocks data 12755 1727204097.22624: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12755 1727204097.22625: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12755 1727204097.22633: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12755 1727204097.22862: done processing included file 12755 1727204097.22865: iterating over new_blocks loaded from include file 12755 1727204097.22867: in VariableManager get_vars() 12755 1727204097.22908: done with get_vars() 12755 1727204097.22910: filtering new block on tags 12755 1727204097.22935: done filtering new block on tags 12755 1727204097.22939: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 12755 1727204097.22944: extending task lists for all hosts with included blocks 12755 1727204097.23094: done extending task lists 12755 1727204097.23095: done processing included files 12755 1727204097.23096: results queue empty 12755 1727204097.23097: checking for any_errors_fatal 12755 1727204097.23099: done checking for any_errors_fatal 12755 1727204097.23101: checking for max_fail_percentage 12755 1727204097.23102: done checking for 
max_fail_percentage 12755 1727204097.23103: checking to see if all hosts have failed and the running result is not ok 12755 1727204097.23104: done checking to see if all hosts have failed 12755 1727204097.23105: getting the remaining hosts for this loop 12755 1727204097.23106: done getting the remaining hosts for this loop 12755 1727204097.23109: getting the next task for host managed-node1 12755 1727204097.23114: done getting next task for host managed-node1 12755 1727204097.23119: ^ task is: TASK: Get stat for interface {{ interface }} 12755 1727204097.23122: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204097.23125: getting variables 12755 1727204097.23126: in VariableManager get_vars() 12755 1727204097.23152: Calling all_inventory to load vars for managed-node1 12755 1727204097.23155: Calling groups_inventory to load vars for managed-node1 12755 1727204097.23157: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204097.23164: Calling all_plugins_play to load vars for managed-node1 12755 1727204097.23167: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204097.23171: Calling groups_plugins_play to load vars for managed-node1 12755 1727204097.25860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204097.30488: done with get_vars() 12755 1727204097.30534: done getting variables 12755 1727204097.30730: variable 'interface' from source: task vars 12755 1727204097.30735: variable 'controller_device' from source: play vars 12755 1727204097.30814: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.216) 0:00:22.544 ***** 12755 1727204097.30856: entering _queue_task() for managed-node1/stat 12755 1727204097.31234: worker is 1 (out of 1 available) 12755 1727204097.31248: exiting _queue_task() for managed-node1/stat 12755 1727204097.31268: done queuing things up, now waiting for results queue to drain 12755 1727204097.31270: waiting for pending results... 
12755 1727204097.31571: running TaskExecutor() for managed-node1/TASK: Get stat for interface nm-bond 12755 1727204097.31747: in run() - task 12b410aa-8751-72e9-1a19-000000000337 12755 1727204097.31772: variable 'ansible_search_path' from source: unknown 12755 1727204097.31783: variable 'ansible_search_path' from source: unknown 12755 1727204097.31896: calling self._execute() 12755 1727204097.31970: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204097.31987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.32009: variable 'omit' from source: magic vars 12755 1727204097.32481: variable 'ansible_distribution_major_version' from source: facts 12755 1727204097.32503: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204097.32517: variable 'omit' from source: magic vars 12755 1727204097.32606: variable 'omit' from source: magic vars 12755 1727204097.32731: variable 'interface' from source: task vars 12755 1727204097.32799: variable 'controller_device' from source: play vars 12755 1727204097.32831: variable 'controller_device' from source: play vars 12755 1727204097.32857: variable 'omit' from source: magic vars 12755 1727204097.32916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204097.32963: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204097.32994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204097.33030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204097.33050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204097.33093: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 12755 1727204097.33127: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204097.33131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.33243: Set connection var ansible_connection to ssh 12755 1727204097.33294: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204097.33298: Set connection var ansible_shell_type to sh 12755 1727204097.33302: Set connection var ansible_timeout to 10 12755 1727204097.33304: Set connection var ansible_shell_executable to /bin/sh 12755 1727204097.33308: Set connection var ansible_pipelining to False 12755 1727204097.33333: variable 'ansible_shell_executable' from source: unknown 12755 1727204097.33347: variable 'ansible_connection' from source: unknown 12755 1727204097.33355: variable 'ansible_module_compression' from source: unknown 12755 1727204097.33364: variable 'ansible_shell_type' from source: unknown 12755 1727204097.33451: variable 'ansible_shell_executable' from source: unknown 12755 1727204097.33455: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204097.33457: variable 'ansible_pipelining' from source: unknown 12755 1727204097.33460: variable 'ansible_timeout' from source: unknown 12755 1727204097.33462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.33644: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204097.33666: variable 'omit' from source: magic vars 12755 1727204097.33680: starting attempt loop 12755 1727204097.33688: running the handler 12755 1727204097.33710: _low_level_execute_command(): starting 12755 1727204097.33725: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 
1727204097.34564: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204097.34595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204097.34615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204097.34738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204097.34888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204097.36632: stdout chunk (state=3): >>>/root <<< 12755 1727204097.36908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204097.36912: stdout chunk (state=3): >>><<< 12755 1727204097.36915: stderr chunk (state=3): >>><<< 12755 1727204097.36997: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204097.37002: _low_level_execute_command(): starting 12755 1727204097.37005: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731 `" && echo ansible-tmp-1727204097.369455-14239-51505019521731="` echo /root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731 `" ) && sleep 0' 12755 1727204097.37678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204097.37697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204097.37711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204097.37767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204097.37871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204097.37898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204097.38110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204097.40098: stdout chunk (state=3): >>>ansible-tmp-1727204097.369455-14239-51505019521731=/root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731 <<< 12755 1727204097.40298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204097.40307: stdout chunk (state=3): >>><<< 12755 1727204097.40310: stderr chunk (state=3): >>><<< 12755 1727204097.40539: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204097.369455-14239-51505019521731=/root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204097.40543: variable 'ansible_module_compression' from source: unknown 12755 1727204097.40575: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12755 1727204097.40695: variable 'ansible_facts' from source: unknown 12755 1727204097.40917: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/AnsiballZ_stat.py 12755 1727204097.41531: Sending initial data 12755 1727204097.41544: Sent initial data (151 bytes) 12755 1727204097.42706: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204097.42912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204097.42955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204097.44819: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12755 1727204097.44846: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204097.44950: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204097.45053: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmp3ss1tmrc /root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/AnsiballZ_stat.py <<< 12755 1727204097.45056: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/AnsiballZ_stat.py" <<< 12755 1727204097.45250: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmp3ss1tmrc" to remote "/root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/AnsiballZ_stat.py" <<< 12755 1727204097.47415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204097.47587: stderr chunk (state=3): >>><<< 12755 1727204097.47593: stdout chunk (state=3): >>><<< 12755 1727204097.47596: done transferring module to remote 12755 1727204097.47598: _low_level_execute_command(): starting 12755 1727204097.47601: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/ /root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/AnsiballZ_stat.py && sleep 0' 12755 1727204097.48767: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204097.48778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204097.48792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204097.48811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204097.48853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204097.49103: 
stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204097.49181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204097.51207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204097.51211: stdout chunk (state=3): >>><<< 12755 1727204097.51298: stderr chunk (state=3): >>><<< 12755 1727204097.51413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204097.51421: _low_level_execute_command(): starting 12755 1727204097.51424: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/AnsiballZ_stat.py && sleep 0' 12755 1727204097.52561: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204097.52682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204097.52904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204097.52908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204097.52923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204097.72175: stdout chunk (state=3): >>> {"changed": 
false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36174, "dev": 23, "nlink": 1, "atime": 1727204095.682173, "mtime": 1727204095.682173, "ctime": 1727204095.682173, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12755 1727204097.74097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204097.74101: stdout chunk (state=3): >>><<< 12755 1727204097.74104: stderr chunk (state=3): >>><<< 12755 1727204097.74107: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36174, "dev": 23, "nlink": 1, "atime": 1727204095.682173, "mtime": 1727204095.682173, "ctime": 1727204095.682173, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204097.74110: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204097.74113: _low_level_execute_command(): starting 12755 1727204097.74115: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204097.369455-14239-51505019521731/ > /dev/null 2>&1 && sleep 0' 12755 1727204097.74620: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204097.74636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204097.74649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204097.74667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204097.74681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 
1727204097.74696: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204097.74797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204097.74843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204097.74883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204097.76920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204097.76977: stderr chunk (state=3): >>><<< 12755 1727204097.76980: stdout chunk (state=3): >>><<< 12755 1727204097.76983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12755 1727204097.76992: handler run complete
12755 1727204097.77035: attempt loop complete, returning result
12755 1727204097.77039: _execute() done
12755 1727204097.77041: dumping result to json
12755 1727204097.77049: done dumping result, returning
12755 1727204097.77057: done running TaskExecutor() for managed-node1/TASK: Get stat for interface nm-bond [12b410aa-8751-72e9-1a19-000000000337]
12755 1727204097.77063: sending task result for task 12b410aa-8751-72e9-1a19-000000000337
12755 1727204097.77177: done sending task result for task 12b410aa-8751-72e9-1a19-000000000337
12755 1727204097.77179: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "atime": 1727204095.682173,
        "block_size": 4096,
        "blocks": 0,
        "ctime": 1727204095.682173,
        "dev": 23,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 36174,
        "isblk": false,
        "ischr": false,
        "isdir": false,
        "isfifo": false,
        "isgid": false,
        "islnk": true,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "lnk_source": "/sys/devices/virtual/net/nm-bond",
        "lnk_target": "../../devices/virtual/net/nm-bond",
        "mode": "0777",
        "mtime": 1727204095.682173,
        "nlink": 1,
        "path": "/sys/class/net/nm-bond",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 0,
        "uid": 0,
        "wgrp": true,
        "woth": true,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
12755 1727204097.77304: no more pending results, returning what we have
12755 1727204097.77308: results queue empty
12755 1727204097.77309: checking for any_errors_fatal
12755 1727204097.77310: done checking for any_errors_fatal
12755 1727204097.77311: checking for max_fail_percentage
12755 1727204097.77312: done checking for max_fail_percentage
12755 1727204097.77313: checking to see if all hosts have failed and the running result is not ok
12755 1727204097.77315: done checking to see if all hosts have failed
12755 1727204097.77315: getting the remaining hosts for this loop
12755 1727204097.77317: done getting the remaining hosts for this loop
12755 1727204097.77322: getting the next task for host managed-node1
12755 1727204097.77331: done getting next task for host managed-node1
12755 1727204097.77333: ^ task is: TASK: Assert that the interface is present - '{{ interface }}'
12755 1727204097.77336: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204097.77340: getting variables
12755 1727204097.77342: in VariableManager get_vars()
12755 1727204097.77398: Calling all_inventory to load vars for managed-node1
12755 1727204097.77402: Calling groups_inventory to load vars for managed-node1
12755 1727204097.77405: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204097.77416: Calling all_plugins_play to load vars for managed-node1
12755 1727204097.77419: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204097.77423: Calling groups_plugins_play to load vars for managed-node1
12755 1727204097.78722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204097.80281: done with get_vars()
12755 1727204097.80306: done getting variables
12755 1727204097.80356: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12755 1727204097.80456: variable 'interface' from source: task vars
12755 1727204097.80460: variable 'controller_device' from source: play vars
12755 1727204097.80510: variable 'controller_device' from source: play vars

TASK [Assert that the interface is present - 'nm-bond'] ************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.496) 0:00:23.041 *****
12755 1727204097.80537: entering _queue_task() for managed-node1/assert
12755 1727204097.80777: worker is 1 (out of 1 available)
12755 1727204097.80795: exiting _queue_task() for managed-node1/assert
12755 1727204097.80808: done queuing things up, now waiting for results queue to drain
12755 1727204097.80809: waiting for pending results...
12755 1727204097.81000: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'nm-bond' 12755 1727204097.81093: in run() - task 12b410aa-8751-72e9-1a19-00000000006f 12755 1727204097.81106: variable 'ansible_search_path' from source: unknown 12755 1727204097.81110: variable 'ansible_search_path' from source: unknown 12755 1727204097.81150: calling self._execute() 12755 1727204097.81232: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204097.81238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.81256: variable 'omit' from source: magic vars 12755 1727204097.81561: variable 'ansible_distribution_major_version' from source: facts 12755 1727204097.81573: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204097.81582: variable 'omit' from source: magic vars 12755 1727204097.81625: variable 'omit' from source: magic vars 12755 1727204097.81703: variable 'interface' from source: task vars 12755 1727204097.81708: variable 'controller_device' from source: play vars 12755 1727204097.81762: variable 'controller_device' from source: play vars 12755 1727204097.81779: variable 'omit' from source: magic vars 12755 1727204097.81820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204097.81853: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204097.81872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204097.81890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204097.81907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204097.81937: variable 'inventory_hostname' from source: 
host vars for 'managed-node1' 12755 1727204097.81941: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204097.81944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.82032: Set connection var ansible_connection to ssh 12755 1727204097.82039: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204097.82042: Set connection var ansible_shell_type to sh 12755 1727204097.82053: Set connection var ansible_timeout to 10 12755 1727204097.82059: Set connection var ansible_shell_executable to /bin/sh 12755 1727204097.82066: Set connection var ansible_pipelining to False 12755 1727204097.82086: variable 'ansible_shell_executable' from source: unknown 12755 1727204097.82091: variable 'ansible_connection' from source: unknown 12755 1727204097.82093: variable 'ansible_module_compression' from source: unknown 12755 1727204097.82098: variable 'ansible_shell_type' from source: unknown 12755 1727204097.82102: variable 'ansible_shell_executable' from source: unknown 12755 1727204097.82107: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204097.82111: variable 'ansible_pipelining' from source: unknown 12755 1727204097.82114: variable 'ansible_timeout' from source: unknown 12755 1727204097.82123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.82246: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204097.82255: variable 'omit' from source: magic vars 12755 1727204097.82261: starting attempt loop 12755 1727204097.82264: running the handler 12755 1727204097.82379: variable 'interface_stat' from source: set_fact 12755 1727204097.82398: Evaluated conditional 
(interface_stat.stat.exists): True
12755 1727204097.82406: handler run complete
12755 1727204097.82419: attempt loop complete, returning result
12755 1727204097.82425: _execute() done
12755 1727204097.82428: dumping result to json
12755 1727204097.82433: done dumping result, returning
12755 1727204097.82440: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'nm-bond' [12b410aa-8751-72e9-1a19-00000000006f]
12755 1727204097.82452: sending task result for task 12b410aa-8751-72e9-1a19-00000000006f
12755 1727204097.82542: done sending task result for task 12b410aa-8751-72e9-1a19-00000000006f
12755 1727204097.82545: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
12755 1727204097.82628: no more pending results, returning what we have
12755 1727204097.82632: results queue empty
12755 1727204097.82633: checking for any_errors_fatal
12755 1727204097.82641: done checking for any_errors_fatal
12755 1727204097.82642: checking for max_fail_percentage
12755 1727204097.82644: done checking for max_fail_percentage
12755 1727204097.82645: checking to see if all hosts have failed and the running result is not ok
12755 1727204097.82646: done checking to see if all hosts have failed
12755 1727204097.82647: getting the remaining hosts for this loop
12755 1727204097.82648: done getting the remaining hosts for this loop
12755 1727204097.82653: getting the next task for host managed-node1
12755 1727204097.82661: done getting next task for host managed-node1
12755 1727204097.82667: ^ task is: TASK: Include the task 'assert_profile_present.yml'
12755 1727204097.82669: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204097.82672: getting variables
12755 1727204097.82674: in VariableManager get_vars()
12755 1727204097.82726: Calling all_inventory to load vars for managed-node1
12755 1727204097.82729: Calling groups_inventory to load vars for managed-node1
12755 1727204097.82732: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204097.82743: Calling all_plugins_play to load vars for managed-node1
12755 1727204097.82746: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204097.82750: Calling groups_plugins_play to load vars for managed-node1
12755 1727204097.84050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204097.85601: done with get_vars()
12755 1727204097.85622: done getting variables

TASK [Include the task 'assert_profile_present.yml'] ***************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67
Tuesday 24 September 2024 14:54:57 -0400 (0:00:00.051) 0:00:23.092 *****
12755 1727204097.85693: entering _queue_task() for managed-node1/include_tasks
12755 1727204097.85919: worker is 1 (out of 1 available)
12755 1727204097.85934: exiting _queue_task() for managed-node1/include_tasks
12755 1727204097.85948: done queuing things up, now waiting for results queue to drain
12755 1727204097.85950: waiting for pending results...
12755 1727204097.86129: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' 12755 1727204097.86209: in run() - task 12b410aa-8751-72e9-1a19-000000000070 12755 1727204097.86222: variable 'ansible_search_path' from source: unknown 12755 1727204097.86265: variable 'controller_profile' from source: play vars 12755 1727204097.86426: variable 'controller_profile' from source: play vars 12755 1727204097.86439: variable 'port1_profile' from source: play vars 12755 1727204097.86495: variable 'port1_profile' from source: play vars 12755 1727204097.86503: variable 'port2_profile' from source: play vars 12755 1727204097.86560: variable 'port2_profile' from source: play vars 12755 1727204097.86573: variable 'omit' from source: magic vars 12755 1727204097.86690: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204097.86702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.86715: variable 'omit' from source: magic vars 12755 1727204097.86920: variable 'ansible_distribution_major_version' from source: facts 12755 1727204097.86931: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204097.86960: variable 'item' from source: unknown 12755 1727204097.87013: variable 'item' from source: unknown 12755 1727204097.87135: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204097.87139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.87142: variable 'omit' from source: magic vars 12755 1727204097.87261: variable 'ansible_distribution_major_version' from source: facts 12755 1727204097.87267: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204097.87293: variable 'item' from source: unknown 12755 1727204097.87344: variable 'item' from source: unknown 12755 1727204097.87424: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 
1727204097.87433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204097.87443: variable 'omit' from source: magic vars 12755 1727204097.87568: variable 'ansible_distribution_major_version' from source: facts 12755 1727204097.87572: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204097.87598: variable 'item' from source: unknown 12755 1727204097.87651: variable 'item' from source: unknown 12755 1727204097.87721: dumping result to json 12755 1727204097.87725: done dumping result, returning 12755 1727204097.87728: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' [12b410aa-8751-72e9-1a19-000000000070] 12755 1727204097.87730: sending task result for task 12b410aa-8751-72e9-1a19-000000000070 12755 1727204097.87773: done sending task result for task 12b410aa-8751-72e9-1a19-000000000070 12755 1727204097.87776: WORKER PROCESS EXITING 12755 1727204097.87817: no more pending results, returning what we have 12755 1727204097.87822: in VariableManager get_vars() 12755 1727204097.87878: Calling all_inventory to load vars for managed-node1 12755 1727204097.87882: Calling groups_inventory to load vars for managed-node1 12755 1727204097.87884: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204097.87903: Calling all_plugins_play to load vars for managed-node1 12755 1727204097.87906: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204097.87910: Calling groups_plugins_play to load vars for managed-node1 12755 1727204097.89071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204097.92883: done with get_vars() 12755 1727204097.92916: variable 'ansible_search_path' from source: unknown 12755 1727204097.92935: variable 'ansible_search_path' from source: unknown 12755 1727204097.92947: variable 'ansible_search_path' from source: unknown 12755 
1727204097.92956: we have included files to process 12755 1727204097.92958: generating all_blocks data 12755 1727204097.92959: done generating all_blocks data 12755 1727204097.92964: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12755 1727204097.92966: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12755 1727204097.92969: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12755 1727204097.93220: in VariableManager get_vars() 12755 1727204097.93264: done with get_vars() 12755 1727204097.93588: done processing included file 12755 1727204097.93592: iterating over new_blocks loaded from include file 12755 1727204097.93594: in VariableManager get_vars() 12755 1727204097.93628: done with get_vars() 12755 1727204097.93631: filtering new block on tags 12755 1727204097.93658: done filtering new block on tags 12755 1727204097.93662: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0) 12755 1727204097.93667: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12755 1727204097.93668: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12755 1727204097.93672: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12755 1727204097.93806: in VariableManager get_vars() 12755 1727204097.93844: done with get_vars() 12755 1727204097.94141: done 
processing included file 12755 1727204097.94143: iterating over new_blocks loaded from include file 12755 1727204097.94145: in VariableManager get_vars() 12755 1727204097.94177: done with get_vars() 12755 1727204097.94179: filtering new block on tags 12755 1727204097.94204: done filtering new block on tags 12755 1727204097.94207: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.0) 12755 1727204097.94211: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12755 1727204097.94213: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12755 1727204097.94216: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12755 1727204097.94350: in VariableManager get_vars() 12755 1727204097.94463: done with get_vars() 12755 1727204097.95005: done processing included file 12755 1727204097.95008: iterating over new_blocks loaded from include file 12755 1727204097.95009: in VariableManager get_vars() 12755 1727204097.95043: done with get_vars() 12755 1727204097.95046: filtering new block on tags 12755 1727204097.95071: done filtering new block on tags 12755 1727204097.95075: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.1) 12755 1727204097.95079: extending task lists for all hosts with included blocks 12755 1727204098.00195: done extending task lists 12755 1727204098.00201: done processing included files 12755 1727204098.00202: results queue empty 12755 
1727204098.00203: checking for any_errors_fatal 12755 1727204098.00207: done checking for any_errors_fatal 12755 1727204098.00207: checking for max_fail_percentage 12755 1727204098.00208: done checking for max_fail_percentage 12755 1727204098.00209: checking to see if all hosts have failed and the running result is not ok 12755 1727204098.00209: done checking to see if all hosts have failed 12755 1727204098.00210: getting the remaining hosts for this loop 12755 1727204098.00211: done getting the remaining hosts for this loop 12755 1727204098.00213: getting the next task for host managed-node1 12755 1727204098.00218: done getting next task for host managed-node1 12755 1727204098.00220: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12755 1727204098.00222: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204098.00224: getting variables 12755 1727204098.00225: in VariableManager get_vars() 12755 1727204098.00243: Calling all_inventory to load vars for managed-node1 12755 1727204098.00245: Calling groups_inventory to load vars for managed-node1 12755 1727204098.00247: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204098.00252: Calling all_plugins_play to load vars for managed-node1 12755 1727204098.00254: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204098.00256: Calling groups_plugins_play to load vars for managed-node1 12755 1727204098.01344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204098.03025: done with get_vars() 12755 1727204098.03056: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.174) 0:00:23.266 *****

12755 1727204098.03125: entering _queue_task() for managed-node1/include_tasks 12755 1727204098.03404: worker is 1 (out of 1 available) 12755 1727204098.03422: exiting _queue_task() for managed-node1/include_tasks 12755 1727204098.03436: done queuing things up, now waiting for results queue to drain 12755 1727204098.03437: waiting for pending results... 
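The records above show the strategy including `assert_profile_present.yml` once per loop item (bond0, bond0.0, bond0.1), evaluating the distribution conditional for each item. A sketch of the kind of looped include that produces this pattern, assuming names taken from the variables the log reports (`controller_profile`, `port1_profile`, `port2_profile` from play vars; `profile` from include params); the exact task in the test playbook may differ:

```yaml
# Sketch only: a looped include_tasks consistent with the
# "included: ... => (item=bond0)" records above.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  loop:
    - "{{ controller_profile }}"  # bond0
    - "{{ port1_profile }}"       # bond0.0
    - "{{ port2_profile }}"       # bond0.1
  vars:
    profile: "{{ item }}"
  when: ansible_distribution_major_version != '6'
```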
12755 1727204098.03623: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 12755 1727204098.03695: in run() - task 12b410aa-8751-72e9-1a19-000000000355 12755 1727204098.03707: variable 'ansible_search_path' from source: unknown 12755 1727204098.03712: variable 'ansible_search_path' from source: unknown 12755 1727204098.03744: calling self._execute() 12755 1727204098.03825: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204098.03833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204098.03842: variable 'omit' from source: magic vars 12755 1727204098.04296: variable 'ansible_distribution_major_version' from source: facts 12755 1727204098.04299: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204098.04302: _execute() done 12755 1727204098.04304: dumping result to json 12755 1727204098.04306: done dumping result, returning 12755 1727204098.04308: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-72e9-1a19-000000000355] 12755 1727204098.04310: sending task result for task 12b410aa-8751-72e9-1a19-000000000355 12755 1727204098.04377: done sending task result for task 12b410aa-8751-72e9-1a19-000000000355 12755 1727204098.04380: WORKER PROCESS EXITING 12755 1727204098.04412: no more pending results, returning what we have 12755 1727204098.04420: in VariableManager get_vars() 12755 1727204098.04474: Calling all_inventory to load vars for managed-node1 12755 1727204098.04477: Calling groups_inventory to load vars for managed-node1 12755 1727204098.04480: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204098.04544: Calling all_plugins_play to load vars for managed-node1 12755 1727204098.04549: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204098.04553: Calling groups_plugins_play to load vars for managed-node1 12755 
1727204098.06436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204098.08010: done with get_vars() 12755 1727204098.08031: variable 'ansible_search_path' from source: unknown 12755 1727204098.08032: variable 'ansible_search_path' from source: unknown 12755 1727204098.08065: we have included files to process 12755 1727204098.08066: generating all_blocks data 12755 1727204098.08067: done generating all_blocks data 12755 1727204098.08068: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12755 1727204098.08069: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12755 1727204098.08071: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12755 1727204098.08901: done processing included file 12755 1727204098.08903: iterating over new_blocks loaded from include file 12755 1727204098.08904: in VariableManager get_vars() 12755 1727204098.08928: done with get_vars() 12755 1727204098.08930: filtering new block on tags 12755 1727204098.08949: done filtering new block on tags 12755 1727204098.08952: in VariableManager get_vars() 12755 1727204098.08972: done with get_vars() 12755 1727204098.08973: filtering new block on tags 12755 1727204098.08991: done filtering new block on tags 12755 1727204098.08993: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 12755 1727204098.08997: extending task lists for all hosts with included blocks 12755 1727204098.09136: done extending task lists 12755 1727204098.09137: done processing included files 12755 1727204098.09138: results queue empty 12755 
1727204098.09138: checking for any_errors_fatal 12755 1727204098.09141: done checking for any_errors_fatal 12755 1727204098.09142: checking for max_fail_percentage 12755 1727204098.09143: done checking for max_fail_percentage 12755 1727204098.09144: checking to see if all hosts have failed and the running result is not ok 12755 1727204098.09144: done checking to see if all hosts have failed 12755 1727204098.09145: getting the remaining hosts for this loop 12755 1727204098.09146: done getting the remaining hosts for this loop 12755 1727204098.09150: getting the next task for host managed-node1 12755 1727204098.09153: done getting next task for host managed-node1 12755 1727204098.09155: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12755 1727204098.09157: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204098.09159: getting variables 12755 1727204098.09159: in VariableManager get_vars() 12755 1727204098.09225: Calling all_inventory to load vars for managed-node1 12755 1727204098.09227: Calling groups_inventory to load vars for managed-node1 12755 1727204098.09229: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204098.09234: Calling all_plugins_play to load vars for managed-node1 12755 1727204098.09236: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204098.09238: Calling groups_plugins_play to load vars for managed-node1 12755 1727204098.10317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204098.11853: done with get_vars() 12755 1727204098.11873: done getting variables 12755 1727204098.11908: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.088) 0:00:23.355 *****

12755 1727204098.11932: entering _queue_task() for managed-node1/set_fact 12755 1727204098.12205: worker is 1 (out of 1 available) 12755 1727204098.12220: exiting _queue_task() for managed-node1/set_fact 12755 1727204098.12236: done queuing things up, now waiting for results queue to drain 12755 1727204098.12237: waiting for pending results... 
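This next task is a plain `set_fact` (the handler runs entirely on the controller, which is why the log shows "running the handler" and "handler run complete" with no remote connection). A sketch consistent with the `ansible_facts` the task returns in this log; the real task lives at `get_profile_stat.yml:3` and may differ in detail:

```yaml
# Sketch: initialize the flags to false before the stat/grep checks
# later in get_profile_stat.yml flip them.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```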
12755 1727204098.12421: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 12755 1727204098.12507: in run() - task 12b410aa-8751-72e9-1a19-0000000005e4 12755 1727204098.12520: variable 'ansible_search_path' from source: unknown 12755 1727204098.12526: variable 'ansible_search_path' from source: unknown 12755 1727204098.12560: calling self._execute() 12755 1727204098.12641: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204098.12648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204098.12659: variable 'omit' from source: magic vars 12755 1727204098.12979: variable 'ansible_distribution_major_version' from source: facts 12755 1727204098.12992: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204098.12999: variable 'omit' from source: magic vars 12755 1727204098.13043: variable 'omit' from source: magic vars 12755 1727204098.13074: variable 'omit' from source: magic vars 12755 1727204098.13112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204098.13146: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204098.13165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204098.13181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204098.13194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204098.13228: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204098.13232: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204098.13236: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12755 1727204098.13320: Set connection var ansible_connection to ssh 12755 1727204098.13328: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204098.13333: Set connection var ansible_shell_type to sh 12755 1727204098.13344: Set connection var ansible_timeout to 10 12755 1727204098.13353: Set connection var ansible_shell_executable to /bin/sh 12755 1727204098.13359: Set connection var ansible_pipelining to False 12755 1727204098.13379: variable 'ansible_shell_executable' from source: unknown 12755 1727204098.13382: variable 'ansible_connection' from source: unknown 12755 1727204098.13385: variable 'ansible_module_compression' from source: unknown 12755 1727204098.13388: variable 'ansible_shell_type' from source: unknown 12755 1727204098.13394: variable 'ansible_shell_executable' from source: unknown 12755 1727204098.13397: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204098.13403: variable 'ansible_pipelining' from source: unknown 12755 1727204098.13406: variable 'ansible_timeout' from source: unknown 12755 1727204098.13411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204098.13531: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204098.13541: variable 'omit' from source: magic vars 12755 1727204098.13547: starting attempt loop 12755 1727204098.13552: running the handler 12755 1727204098.13566: handler run complete 12755 1727204098.13576: attempt loop complete, returning result 12755 1727204098.13579: _execute() done 12755 1727204098.13583: dumping result to json 12755 1727204098.13585: done dumping result, returning 12755 1727204098.13595: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-72e9-1a19-0000000005e4] 12755 1727204098.13600: sending task result for task 12b410aa-8751-72e9-1a19-0000000005e4 12755 1727204098.13702: done sending task result for task 12b410aa-8751-72e9-1a19-0000000005e4 12755 1727204098.13705: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
12755 1727204098.13762: no more pending results, returning what we have 12755 1727204098.13765: results queue empty 12755 1727204098.13766: checking for any_errors_fatal 12755 1727204098.13768: done checking for any_errors_fatal 12755 1727204098.13768: checking for max_fail_percentage 12755 1727204098.13770: done checking for max_fail_percentage 12755 1727204098.13771: checking to see if all hosts have failed and the running result is not ok 12755 1727204098.13772: done checking to see if all hosts have failed 12755 1727204098.13773: getting the remaining hosts for this loop 12755 1727204098.13781: done getting the remaining hosts for this loop 12755 1727204098.13786: getting the next task for host managed-node1 12755 1727204098.13795: done getting next task for host managed-node1 12755 1727204098.13798: ^ task is: TASK: Stat profile file 12755 1727204098.13802: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204098.13807: getting variables 12755 1727204098.13809: in VariableManager get_vars() 12755 1727204098.13862: Calling all_inventory to load vars for managed-node1 12755 1727204098.13865: Calling groups_inventory to load vars for managed-node1 12755 1727204098.13868: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204098.13879: Calling all_plugins_play to load vars for managed-node1 12755 1727204098.13882: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204098.13886: Calling groups_plugins_play to load vars for managed-node1 12755 1727204098.15181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204098.17384: done with get_vars() 12755 1727204098.17406: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.055) 0:00:23.410 *****

12755 1727204098.17478: entering _queue_task() for managed-node1/stat 12755 1727204098.17701: worker is 1 (out of 1 available) 12755 1727204098.17717: exiting _queue_task() for managed-node1/stat 12755 1727204098.17731: done queuing things up, now waiting for results queue to drain 12755 1727204098.17732: waiting for pending results... 
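Unlike the preceding `set_fact`, this `stat` task loads the 'normal' action plugin and opens an SSH connection, because the module must run on the managed node. A sketch of the shape of such a task, using the `profile` include param the log reports; the path template here is an assumption for illustration, not the one from `get_profile_stat.yml`:

```yaml
# Sketch only: stat the profile file for the current item
# (bond0 / bond0.0 / bond0.1). The exact path is hypothetical.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat
```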
12755 1727204098.17910: running TaskExecutor() for managed-node1/TASK: Stat profile file 12755 1727204098.17994: in run() - task 12b410aa-8751-72e9-1a19-0000000005e5 12755 1727204098.18007: variable 'ansible_search_path' from source: unknown 12755 1727204098.18010: variable 'ansible_search_path' from source: unknown 12755 1727204098.18044: calling self._execute() 12755 1727204098.18118: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204098.18128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204098.18139: variable 'omit' from source: magic vars 12755 1727204098.18455: variable 'ansible_distribution_major_version' from source: facts 12755 1727204098.18694: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204098.18699: variable 'omit' from source: magic vars 12755 1727204098.18701: variable 'omit' from source: magic vars 12755 1727204098.18704: variable 'profile' from source: include params 12755 1727204098.18706: variable 'item' from source: include params 12755 1727204098.18759: variable 'item' from source: include params 12755 1727204098.18787: variable 'omit' from source: magic vars 12755 1727204098.18848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204098.18895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204098.19108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204098.19134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204098.19152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204098.19191: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 
1727204098.19394: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204098.19398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204098.19458: Set connection var ansible_connection to ssh 12755 1727204098.19471: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204098.19478: Set connection var ansible_shell_type to sh 12755 1727204098.19499: Set connection var ansible_timeout to 10 12755 1727204098.19511: Set connection var ansible_shell_executable to /bin/sh 12755 1727204098.19523: Set connection var ansible_pipelining to False 12755 1727204098.19557: variable 'ansible_shell_executable' from source: unknown 12755 1727204098.19566: variable 'ansible_connection' from source: unknown 12755 1727204098.19573: variable 'ansible_module_compression' from source: unknown 12755 1727204098.19580: variable 'ansible_shell_type' from source: unknown 12755 1727204098.19587: variable 'ansible_shell_executable' from source: unknown 12755 1727204098.19597: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204098.19606: variable 'ansible_pipelining' from source: unknown 12755 1727204098.19614: variable 'ansible_timeout' from source: unknown 12755 1727204098.19622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204098.19866: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204098.19968: variable 'omit' from source: magic vars 12755 1727204098.19971: starting attempt loop 12755 1727204098.19974: running the handler 12755 1727204098.19976: _low_level_execute_command(): starting 12755 1727204098.19978: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204098.20731: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204098.20751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204098.20768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204098.20858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204098.20905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204098.20930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204098.20976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204098.21023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204098.23042: stdout chunk (state=3): >>>/root <<< 12755 1727204098.23150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204098.23498: stderr chunk (state=3): >>><<< 12755 1727204098.23502: stdout chunk (state=3): >>><<< 12755 1727204098.23510: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204098.23512: _low_level_execute_command(): starting 12755 1727204098.23518: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079 `" && echo ansible-tmp-1727204098.2339373-14342-245098384636079="` echo /root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079 `" ) && sleep 0' 12755 1727204098.24633: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204098.24648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204098.24721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204098.24855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204098.24897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204098.24946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204098.25028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204098.27426: stdout chunk (state=3): >>>ansible-tmp-1727204098.2339373-14342-245098384636079=/root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079 <<< 12755 1727204098.27430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204098.27433: stderr chunk (state=3): >>><<< 12755 1727204098.27457: stdout chunk (state=3): >>><<< 12755 1727204098.27484: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204098.2339373-14342-245098384636079=/root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204098.27550: variable 'ansible_module_compression' from source: unknown 12755 1727204098.27620: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12755 1727204098.27688: variable 'ansible_facts' from source: unknown 12755 1727204098.27785: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/AnsiballZ_stat.py 12755 1727204098.28187: Sending initial data 12755 1727204098.28198: Sent initial data (153 bytes) 12755 1727204098.29611: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204098.29947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204098.29952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204098.30055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204098.32021: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 12755 1727204098.32030: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204098.32205: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204098.32238: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmp99_tztpg /root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/AnsiballZ_stat.py <<< 12755 1727204098.32244: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/AnsiballZ_stat.py" <<< 12755 1727204098.32333: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmp99_tztpg" to remote "/root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/AnsiballZ_stat.py" <<< 12755 1727204098.35077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204098.35196: stderr chunk (state=3): >>><<< 12755 1727204098.35200: stdout chunk (state=3): >>><<< 12755 1727204098.35215: done transferring module to remote 12755 1727204098.35418: _low_level_execute_command(): starting 12755 1727204098.35422: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/ /root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/AnsiballZ_stat.py && sleep 0' 12755 1727204098.36507: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204098.36557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204098.36678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204098.36760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204098.38938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204098.38965: stdout chunk (state=3): >>><<< 12755 1727204098.38968: stderr chunk (state=3): >>><<< 12755 1727204098.38994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204098.39193: _low_level_execute_command(): starting 12755 1727204098.39197: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/AnsiballZ_stat.py && sleep 0' 12755 1727204098.40520: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204098.40683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204098.40687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204098.40916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204098.40977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204098.59396: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": 
"/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12755 1727204098.61327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204098.61331: stdout chunk (state=3): >>><<< 12755 1727204098.61334: stderr chunk (state=3): >>><<< 12755 1727204098.61336: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204098.61340: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204098.61344: _low_level_execute_command(): starting 12755 1727204098.61346: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204098.2339373-14342-245098384636079/ > /dev/null 2>&1 && sleep 0' 12755 1727204098.62920: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204098.62924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204098.62927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204098.62954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204098.63302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204098.65071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204098.65075: stdout chunk (state=3): >>><<< 12755 1727204098.65083: stderr chunk (state=3): >>><<< 12755 1727204098.65104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204098.65112: handler run complete 12755 1727204098.65147: attempt loop complete, returning 
result 12755 1727204098.65151: _execute() done 12755 1727204098.65154: dumping result to json 12755 1727204098.65160: done dumping result, returning 12755 1727204098.65170: done running TaskExecutor() for managed-node1/TASK: Stat profile file [12b410aa-8751-72e9-1a19-0000000005e5] 12755 1727204098.65176: sending task result for task 12b410aa-8751-72e9-1a19-0000000005e5 ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 12755 1727204098.65573: no more pending results, returning what we have 12755 1727204098.65577: results queue empty 12755 1727204098.65578: checking for any_errors_fatal 12755 1727204098.65588: done checking for any_errors_fatal 12755 1727204098.65592: checking for max_fail_percentage 12755 1727204098.65594: done checking for max_fail_percentage 12755 1727204098.65594: checking to see if all hosts have failed and the running result is not ok 12755 1727204098.65595: done checking to see if all hosts have failed 12755 1727204098.65596: getting the remaining hosts for this loop 12755 1727204098.65598: done getting the remaining hosts for this loop 12755 1727204098.65603: getting the next task for host managed-node1 12755 1727204098.65612: done getting next task for host managed-node1 12755 1727204098.65614: ^ task is: TASK: Set NM profile exist flag based on the profile files 12755 1727204098.65618: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204098.65624: getting variables 12755 1727204098.65626: in VariableManager get_vars() 12755 1727204098.65687: Calling all_inventory to load vars for managed-node1 12755 1727204098.65894: Calling groups_inventory to load vars for managed-node1 12755 1727204098.65899: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204098.65913: Calling all_plugins_play to load vars for managed-node1 12755 1727204098.65917: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204098.65922: Calling groups_plugins_play to load vars for managed-node1 12755 1727204098.66607: done sending task result for task 12b410aa-8751-72e9-1a19-0000000005e5 12755 1727204098.66611: WORKER PROCESS EXITING 12755 1727204098.72074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204098.78799: done with get_vars() 12755 1727204098.78848: done getting variables 12755 1727204098.78927: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.614) 0:00:24.025 ***** 12755 1727204098.78963: entering _queue_task() for managed-node1/set_fact 12755 1727204098.79749: worker is 1 (out of 1 available) 12755 1727204098.79764: exiting _queue_task() for 
managed-node1/set_fact 12755 1727204098.79779: done queuing things up, now waiting for results queue to drain 12755 1727204098.79780: waiting for pending results... 12755 1727204098.80809: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 12755 1727204098.81397: in run() - task 12b410aa-8751-72e9-1a19-0000000005e6 12755 1727204098.81402: variable 'ansible_search_path' from source: unknown 12755 1727204098.81405: variable 'ansible_search_path' from source: unknown 12755 1727204098.81408: calling self._execute() 12755 1727204098.81996: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204098.82000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204098.82004: variable 'omit' from source: magic vars 12755 1727204098.83078: variable 'ansible_distribution_major_version' from source: facts 12755 1727204098.83103: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204098.83673: variable 'profile_stat' from source: set_fact 12755 1727204098.83699: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204098.83710: when evaluation is False, skipping this task 12755 1727204098.83718: _execute() done 12755 1727204098.83728: dumping result to json 12755 1727204098.83738: done dumping result, returning 12755 1727204098.83752: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-72e9-1a19-0000000005e6] 12755 1727204098.83763: sending task result for task 12b410aa-8751-72e9-1a19-0000000005e6 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204098.84419: no more pending results, returning what we have 12755 1727204098.84425: results queue empty 12755 1727204098.84427: checking for any_errors_fatal 12755 1727204098.84438: done checking for any_errors_fatal 
12755 1727204098.84439: checking for max_fail_percentage 12755 1727204098.84441: done checking for max_fail_percentage 12755 1727204098.84442: checking to see if all hosts have failed and the running result is not ok 12755 1727204098.84443: done checking to see if all hosts have failed 12755 1727204098.84444: getting the remaining hosts for this loop 12755 1727204098.84446: done getting the remaining hosts for this loop 12755 1727204098.84451: getting the next task for host managed-node1 12755 1727204098.84458: done getting next task for host managed-node1 12755 1727204098.84461: ^ task is: TASK: Get NM profile info 12755 1727204098.84465: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204098.84469: getting variables 12755 1727204098.84471: in VariableManager get_vars() 12755 1727204098.84541: Calling all_inventory to load vars for managed-node1 12755 1727204098.84545: Calling groups_inventory to load vars for managed-node1 12755 1727204098.84549: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204098.84565: Calling all_plugins_play to load vars for managed-node1 12755 1727204098.84570: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204098.84574: Calling groups_plugins_play to load vars for managed-node1 12755 1727204098.85300: done sending task result for task 12b410aa-8751-72e9-1a19-0000000005e6 12755 1727204098.86095: WORKER PROCESS EXITING 12755 1727204098.89650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204098.96039: done with get_vars() 12755 1727204098.96084: done getting variables 12755 1727204098.96167: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:58 -0400 (0:00:00.174) 0:00:24.199 ***** 12755 1727204098.96419: entering _queue_task() for managed-node1/shell 12755 1727204098.97242: worker is 1 (out of 1 available) 12755 1727204098.97257: exiting _queue_task() for managed-node1/shell 12755 1727204098.97270: done queuing things up, now waiting for results queue to drain 12755 1727204098.97272: waiting for pending results... 
12755 1727204098.97815: running TaskExecutor() for managed-node1/TASK: Get NM profile info 12755 1727204098.98696: in run() - task 12b410aa-8751-72e9-1a19-0000000005e7 12755 1727204098.98700: variable 'ansible_search_path' from source: unknown 12755 1727204098.98702: variable 'ansible_search_path' from source: unknown 12755 1727204098.98706: calling self._execute() 12755 1727204098.98709: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204098.98712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204098.99295: variable 'omit' from source: magic vars 12755 1727204099.00343: variable 'ansible_distribution_major_version' from source: facts 12755 1727204099.00365: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204099.00379: variable 'omit' from source: magic vars 12755 1727204099.00443: variable 'omit' from source: magic vars 12755 1727204099.00970: variable 'profile' from source: include params 12755 1727204099.00982: variable 'item' from source: include params 12755 1727204099.01066: variable 'item' from source: include params 12755 1727204099.01322: variable 'omit' from source: magic vars 12755 1727204099.01379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204099.01431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204099.01620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204099.01649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204099.01669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204099.01934: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 
1727204099.01946: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204099.01957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204099.02086: Set connection var ansible_connection to ssh 12755 1727204099.02306: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204099.02315: Set connection var ansible_shell_type to sh 12755 1727204099.02334: Set connection var ansible_timeout to 10 12755 1727204099.02348: Set connection var ansible_shell_executable to /bin/sh 12755 1727204099.02795: Set connection var ansible_pipelining to False 12755 1727204099.02799: variable 'ansible_shell_executable' from source: unknown 12755 1727204099.02801: variable 'ansible_connection' from source: unknown 12755 1727204099.02806: variable 'ansible_module_compression' from source: unknown 12755 1727204099.02809: variable 'ansible_shell_type' from source: unknown 12755 1727204099.02811: variable 'ansible_shell_executable' from source: unknown 12755 1727204099.02814: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204099.02817: variable 'ansible_pipelining' from source: unknown 12755 1727204099.02995: variable 'ansible_timeout' from source: unknown 12755 1727204099.02999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204099.03236: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204099.03256: variable 'omit' from source: magic vars 12755 1727204099.03269: starting attempt loop 12755 1727204099.03277: running the handler 12755 1727204099.03297: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204099.03323: _low_level_execute_command(): starting 12755 1727204099.03406: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204099.05146: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204099.05410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204099.05437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204099.05461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204099.05543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204099.07308: stdout chunk (state=3): >>>/root <<< 12755 1727204099.07484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204099.07504: stdout chunk (state=3): >>><<< 12755 1727204099.07520: stderr chunk (state=3): >>><<< 12755 
1727204099.07548: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204099.07570: _low_level_execute_command(): starting 12755 1727204099.07582: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101 `" && echo ansible-tmp-1727204099.0755584-14431-169672025618101="` echo /root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101 `" ) && sleep 0' 12755 1727204099.08804: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204099.08836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204099.08943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204099.09070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204099.09167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204099.09195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204099.09278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204099.11361: stdout chunk (state=3): >>>ansible-tmp-1727204099.0755584-14431-169672025618101=/root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101 <<< 12755 1727204099.11482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204099.11606: stderr chunk (state=3): >>><<< 12755 1727204099.11618: stdout chunk (state=3): >>><<< 12755 1727204099.11897: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204099.0755584-14431-169672025618101=/root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204099.11900: variable 'ansible_module_compression' from source: unknown 12755 1727204099.11913: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204099.11962: variable 'ansible_facts' from source: unknown 12755 1727204099.12179: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/AnsiballZ_command.py 12755 1727204099.12573: Sending initial data 12755 1727204099.12583: Sent initial data (156 bytes) 12755 1727204099.13770: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204099.13992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204099.14017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204099.14035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204099.14058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204099.14221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204099.17300: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204099.17327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204099.17393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmppls3x1eb /root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/AnsiballZ_command.py <<< 12755 1727204099.17398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/AnsiballZ_command.py" <<< 12755 1727204099.17452: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmppls3x1eb" to remote "/root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/AnsiballZ_command.py" <<< 12755 1727204099.19608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204099.19619: stdout chunk (state=3): >>><<< 12755 1727204099.19634: stderr chunk (state=3): >>><<< 12755 1727204099.19721: done transferring module to remote 12755 1727204099.19740: _low_level_execute_command(): starting 12755 1727204099.19751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/ /root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/AnsiballZ_command.py && sleep 0' 12755 1727204099.21108: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204099.21154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204099.21180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204099.21199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204099.21464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204099.23554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204099.23560: stdout chunk (state=3): >>><<< 12755 1727204099.23563: stderr chunk (state=3): >>><<< 12755 1727204099.23583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204099.23596: _low_level_execute_command(): starting 12755 1727204099.23608: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/AnsiballZ_command.py && sleep 0' 12755 1727204099.24907: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204099.25133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204099.25222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204099.45568: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 
/etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:59.431286", "end": "2024-09-24 14:54:59.454491", "delta": "0:00:00.023205", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204099.47422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204099.47621: stderr chunk (state=3): >>><<< 12755 1727204099.47625: stdout chunk (state=3): >>><<< 12755 1727204099.47798: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:59.431286", "end": "2024-09-24 14:54:59.454491", "delta": "0:00:00.023205", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204099.47802: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204099.47806: _low_level_execute_command(): starting 12755 1727204099.47808: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204099.0755584-14431-169672025618101/ > /dev/null 2>&1 && sleep 0' 12755 
1727204099.48919: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204099.49113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204099.49329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204099.49374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204099.51696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204099.51700: stdout chunk (state=3): >>><<< 12755 1727204099.51703: stderr chunk (state=3): >>><<< 12755 1727204099.51705: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204099.51708: handler run complete 12755 1727204099.51710: Evaluated conditional (False): False 12755 1727204099.51712: attempt loop complete, returning result 12755 1727204099.51714: _execute() done 12755 1727204099.51716: dumping result to json 12755 1727204099.51718: done dumping result, returning 12755 1727204099.51720: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [12b410aa-8751-72e9-1a19-0000000005e7] 12755 1727204099.51723: sending task result for task 12b410aa-8751-72e9-1a19-0000000005e7 12755 1727204099.51802: done sending task result for task 12b410aa-8751-72e9-1a19-0000000005e7 12755 1727204099.51805: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.023205", "end": "2024-09-24 14:54:59.454491", "rc": 0, "start": "2024-09-24 14:54:59.431286" } STDOUT: bond0 /etc/NetworkManager/system-connections/bond0.nmconnection bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 12755 1727204099.51906: no more pending results, returning what we have 12755 1727204099.51910: results 
queue empty 12755 1727204099.51912: checking for any_errors_fatal 12755 1727204099.51920: done checking for any_errors_fatal 12755 1727204099.51922: checking for max_fail_percentage 12755 1727204099.51924: done checking for max_fail_percentage 12755 1727204099.51925: checking to see if all hosts have failed and the running result is not ok 12755 1727204099.51926: done checking to see if all hosts have failed 12755 1727204099.51927: getting the remaining hosts for this loop 12755 1727204099.51929: done getting the remaining hosts for this loop 12755 1727204099.51934: getting the next task for host managed-node1 12755 1727204099.51943: done getting next task for host managed-node1 12755 1727204099.51946: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12755 1727204099.51952: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204099.51957: getting variables 12755 1727204099.51959: in VariableManager get_vars() 12755 1727204099.52247: Calling all_inventory to load vars for managed-node1 12755 1727204099.52251: Calling groups_inventory to load vars for managed-node1 12755 1727204099.52254: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204099.52270: Calling all_plugins_play to load vars for managed-node1 12755 1727204099.52273: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204099.52278: Calling groups_plugins_play to load vars for managed-node1 12755 1727204099.56943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204099.61256: done with get_vars() 12755 1727204099.61306: done getting variables 12755 1727204099.61385: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.650) 0:00:24.849 ***** 12755 1727204099.61429: entering _queue_task() for managed-node1/set_fact 12755 1727204099.61931: worker is 1 (out of 1 available) 12755 1727204099.61944: exiting _queue_task() for managed-node1/set_fact 12755 1727204099.61958: done queuing things up, now waiting for results queue to drain 12755 1727204099.61959: waiting for pending results... 
12755 1727204099.62309: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12755 1727204099.62361: in run() - task 12b410aa-8751-72e9-1a19-0000000005e8 12755 1727204099.62385: variable 'ansible_search_path' from source: unknown 12755 1727204099.62398: variable 'ansible_search_path' from source: unknown 12755 1727204099.62454: calling self._execute() 12755 1727204099.62819: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204099.62825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204099.62828: variable 'omit' from source: magic vars 12755 1727204099.63768: variable 'ansible_distribution_major_version' from source: facts 12755 1727204099.63788: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204099.63987: variable 'nm_profile_exists' from source: set_fact 12755 1727204099.64015: Evaluated conditional (nm_profile_exists.rc == 0): True 12755 1727204099.64037: variable 'omit' from source: magic vars 12755 1727204099.64344: variable 'omit' from source: magic vars 12755 1727204099.64349: variable 'omit' from source: magic vars 12755 1727204099.64391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204099.64439: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204099.64475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204099.64795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204099.64799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204099.64807: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
12755 1727204099.64810: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204099.64812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204099.65245: Set connection var ansible_connection to ssh 12755 1727204099.65250: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204099.65253: Set connection var ansible_shell_type to sh 12755 1727204099.65257: Set connection var ansible_timeout to 10 12755 1727204099.65295: Set connection var ansible_shell_executable to /bin/sh 12755 1727204099.65299: Set connection var ansible_pipelining to False 12755 1727204099.65316: variable 'ansible_shell_executable' from source: unknown 12755 1727204099.65371: variable 'ansible_connection' from source: unknown 12755 1727204099.65375: variable 'ansible_module_compression' from source: unknown 12755 1727204099.65377: variable 'ansible_shell_type' from source: unknown 12755 1727204099.65494: variable 'ansible_shell_executable' from source: unknown 12755 1727204099.65498: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204099.65501: variable 'ansible_pipelining' from source: unknown 12755 1727204099.65503: variable 'ansible_timeout' from source: unknown 12755 1727204099.65507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204099.65899: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204099.65903: variable 'omit' from source: magic vars 12755 1727204099.65906: starting attempt loop 12755 1727204099.65908: running the handler 12755 1727204099.65927: handler run complete 12755 1727204099.65944: attempt loop complete, returning result 12755 1727204099.66035: _execute() done 
12755 1727204099.66064: dumping result to json 12755 1727204099.66074: done dumping result, returning 12755 1727204099.66088: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-72e9-1a19-0000000005e8] 12755 1727204099.66123: sending task result for task 12b410aa-8751-72e9-1a19-0000000005e8 ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12755 1727204099.66444: no more pending results, returning what we have 12755 1727204099.66447: results queue empty 12755 1727204099.66449: checking for any_errors_fatal 12755 1727204099.66461: done checking for any_errors_fatal 12755 1727204099.66462: checking for max_fail_percentage 12755 1727204099.66464: done checking for max_fail_percentage 12755 1727204099.66465: checking to see if all hosts have failed and the running result is not ok 12755 1727204099.66467: done checking to see if all hosts have failed 12755 1727204099.66468: getting the remaining hosts for this loop 12755 1727204099.66470: done getting the remaining hosts for this loop 12755 1727204099.66476: getting the next task for host managed-node1 12755 1727204099.66488: done getting next task for host managed-node1 12755 1727204099.66492: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12755 1727204099.66498: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204099.66504: getting variables 12755 1727204099.66506: in VariableManager get_vars() 12755 1727204099.66575: Calling all_inventory to load vars for managed-node1 12755 1727204099.66579: Calling groups_inventory to load vars for managed-node1 12755 1727204099.66582: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204099.66706: Calling all_plugins_play to load vars for managed-node1 12755 1727204099.66710: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204099.66716: Calling groups_plugins_play to load vars for managed-node1 12755 1727204099.67369: done sending task result for task 12b410aa-8751-72e9-1a19-0000000005e8 12755 1727204099.67373: WORKER PROCESS EXITING 12755 1727204099.70477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204099.74966: done with get_vars() 12755 1727204099.75014: done getting variables 12755 1727204099.75086: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204099.75559: variable 'profile' from source: include params 12755 1727204099.75564: variable 'item' from source: include params 12755 1727204099.75706: variable 'item' from source: include params TASK [Get the 
ansible_managed comment in ifcfg-bond0] **************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.143) 0:00:24.993 *****
12755 1727204099.75752: entering _queue_task() for managed-node1/command
12755 1727204099.76439: worker is 1 (out of 1 available)
12755 1727204099.76696: exiting _queue_task() for managed-node1/command
12755 1727204099.76708: done queuing things up, now waiting for results queue to drain
12755 1727204099.76709: waiting for pending results...
12755 1727204099.76782: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0
12755 1727204099.76950: in run() - task 12b410aa-8751-72e9-1a19-0000000005ea
12755 1727204099.76974: variable 'ansible_search_path' from source: unknown
12755 1727204099.76983: variable 'ansible_search_path' from source: unknown
12755 1727204099.77033: calling self._execute()
12755 1727204099.77155: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204099.77170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204099.77186: variable 'omit' from source: magic vars
12755 1727204099.77630: variable 'ansible_distribution_major_version' from source: facts
12755 1727204099.77650: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204099.77827: variable 'profile_stat' from source: set_fact
12755 1727204099.77846: Evaluated conditional (profile_stat.stat.exists): False
12755 1727204099.77854: when evaluation is False, skipping this task
12755 1727204099.77861: _execute() done
12755 1727204099.77868: dumping result to json
12755 1727204099.77876: done dumping result, returning
12755 1727204099.77888: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [12b410aa-8751-72e9-1a19-0000000005ea]
12755 1727204099.77901: sending task result for task 12b410aa-8751-72e9-1a19-0000000005ea
skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" }
12755 1727204099.78080: no more pending results, returning what we have
12755 1727204099.78085: results queue empty
12755 1727204099.78087: checking for any_errors_fatal
12755 1727204099.78096: done checking for any_errors_fatal
12755 1727204099.78098: checking for max_fail_percentage
12755 1727204099.78100: done checking for max_fail_percentage
12755 1727204099.78101: checking to see if all hosts have failed and the running result is not ok
12755 1727204099.78102: done checking to see if all hosts have failed
12755 1727204099.78103: getting the remaining hosts for this loop
12755 1727204099.78104: done getting the remaining hosts for this loop
12755 1727204099.78110: getting the next task for host managed-node1
12755 1727204099.78122: done getting next task for host managed-node1
12755 1727204099.78125: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
12755 1727204099.78133: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204099.78140: getting variables
12755 1727204099.78142: in VariableManager get_vars()
12755 1727204099.78261: Calling all_inventory to load vars for managed-node1
12755 1727204099.78265: Calling groups_inventory to load vars for managed-node1
12755 1727204099.78267: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204099.78285: Calling all_plugins_play to load vars for managed-node1
12755 1727204099.78288: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204099.78295: Calling groups_plugins_play to load vars for managed-node1
12755 1727204099.78975: done sending task result for task 12b410aa-8751-72e9-1a19-0000000005ea
12755 1727204099.78979: WORKER PROCESS EXITING
12755 1727204099.81039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204099.84108: done with get_vars()
12755 1727204099.84155: done getting variables
12755 1727204099.84230: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12755 1727204099.84374: variable 'profile' from source: include params
12755 1727204099.84378: variable 'item' from source: include params
12755 1727204099.84456: variable 'item' from source: include params

TASK [Verify the ansible_managed comment in ifcfg-bond0] ***********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.087) 0:00:25.080 *****
12755 1727204099.84500: entering _queue_task() for managed-node1/set_fact
12755 1727204099.84887: worker is 1 (out of 1 available)
12755 1727204099.85008: exiting _queue_task() for managed-node1/set_fact
12755 1727204099.85023: done queuing things up, now waiting for results queue to drain
12755 1727204099.85025: waiting for pending results...
12755 1727204099.85267: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0
12755 1727204099.85433: in run() - task 12b410aa-8751-72e9-1a19-0000000005eb
12755 1727204099.85496: variable 'ansible_search_path' from source: unknown
12755 1727204099.85500: variable 'ansible_search_path' from source: unknown
12755 1727204099.85525: calling self._execute()
12755 1727204099.85642: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204099.85661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204099.85694: variable 'omit' from source: magic vars
12755 1727204099.86145: variable 'ansible_distribution_major_version' from source: facts
12755 1727204099.86208: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204099.86350: variable 'profile_stat' from source: set_fact
12755 1727204099.86373: Evaluated conditional (profile_stat.stat.exists): False
12755 1727204099.86383: when evaluation is False, skipping this task
12755 1727204099.86394: _execute() done
12755 1727204099.86403: dumping result to json
12755 1727204099.86428: done dumping result, returning
12755 1727204099.86431: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [12b410aa-8751-72e9-1a19-0000000005eb]
12755 1727204099.86444: sending task result for task 12b410aa-8751-72e9-1a19-0000000005eb
12755 1727204099.86755: done sending task result for task 12b410aa-8751-72e9-1a19-0000000005eb
12755 1727204099.86759: WORKER PROCESS EXITING
skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" }
12755 1727204099.86813: no more pending results, returning what we have
12755 1727204099.86819: results queue empty
12755 1727204099.86821: checking for any_errors_fatal
12755 1727204099.86826: done checking for any_errors_fatal
12755 1727204099.86827: checking for max_fail_percentage
12755 1727204099.86829: done checking for max_fail_percentage
12755 1727204099.86829: checking to see if all hosts have failed and the running result is not ok
12755 1727204099.86831: done checking to see if all hosts have failed
12755 1727204099.86831: getting the remaining hosts for this loop
12755 1727204099.86833: done getting the remaining hosts for this loop
12755 1727204099.86838: getting the next task for host managed-node1
12755 1727204099.86845: done getting next task for host managed-node1
12755 1727204099.86848: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
12755 1727204099.86852: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204099.86857: getting variables
12755 1727204099.86858: in VariableManager get_vars()
12755 1727204099.87054: Calling all_inventory to load vars for managed-node1
12755 1727204099.87058: Calling groups_inventory to load vars for managed-node1
12755 1727204099.87061: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204099.87073: Calling all_plugins_play to load vars for managed-node1
12755 1727204099.87077: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204099.87081: Calling groups_plugins_play to load vars for managed-node1
12755 1727204099.89452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204099.92551: done with get_vars()
12755 1727204099.92591: done getting variables
12755 1727204099.92668: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12755 1727204099.92807: variable 'profile' from source: include params
12755 1727204099.92811: variable 'item' from source: include params
12755 1727204099.92898: variable 'item' from source: include params

TASK [Get the fingerprint comment in ifcfg-bond0] ******************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Tuesday 24 September 2024 14:54:59 -0400 (0:00:00.084) 0:00:25.165 *****
12755 1727204099.92936: entering _queue_task() for managed-node1/command
12755 1727204099.93420: worker is 1 (out of 1 available)
12755 1727204099.93433: exiting _queue_task() for managed-node1/command
12755 1727204099.93445: done queuing things up, now waiting for results queue to drain
12755 1727204099.93447: waiting for pending results...
12755 1727204099.93663: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0
12755 1727204099.93836: in run() - task 12b410aa-8751-72e9-1a19-0000000005ec
12755 1727204099.93936: variable 'ansible_search_path' from source: unknown
12755 1727204099.93940: variable 'ansible_search_path' from source: unknown
12755 1727204099.93944: calling self._execute()
12755 1727204099.94030: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204099.94051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204099.94075: variable 'omit' from source: magic vars
12755 1727204099.94535: variable 'ansible_distribution_major_version' from source: facts
12755 1727204099.94555: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204099.94733: variable 'profile_stat' from source: set_fact
12755 1727204099.94754: Evaluated conditional (profile_stat.stat.exists): False
12755 1727204099.94807: when evaluation is False, skipping this task
12755 1727204099.94812: _execute() done
12755 1727204099.94815: dumping result to json
12755 1727204099.94820: done dumping result, returning
12755 1727204099.94823: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0 [12b410aa-8751-72e9-1a19-0000000005ec]
12755 1727204099.94826: sending task result for task 12b410aa-8751-72e9-1a19-0000000005ec
12755 1727204099.94991: done sending task result for task 12b410aa-8751-72e9-1a19-0000000005ec
12755 1727204099.94994: WORKER PROCESS EXITING
skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" }
12755 1727204099.95064: no more pending results, returning what we have
12755 1727204099.95069: results queue empty
12755 1727204099.95070: checking for any_errors_fatal
12755 1727204099.95078: done checking for any_errors_fatal
12755 1727204099.95079: checking for max_fail_percentage
12755 1727204099.95081: done checking for max_fail_percentage
12755 1727204099.95082: checking to see if all hosts have failed and the running result is not ok
12755 1727204099.95083: done checking to see if all hosts have failed
12755 1727204099.95084: getting the remaining hosts for this loop
12755 1727204099.95086: done getting the remaining hosts for this loop
12755 1727204099.95095: getting the next task for host managed-node1
12755 1727204099.95103: done getting next task for host managed-node1
12755 1727204099.95106: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
12755 1727204099.95110: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204099.95119: getting variables
12755 1727204099.95121: in VariableManager get_vars()
12755 1727204099.95185: Calling all_inventory to load vars for managed-node1
12755 1727204099.95406: Calling groups_inventory to load vars for managed-node1
12755 1727204099.95411: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204099.95424: Calling all_plugins_play to load vars for managed-node1
12755 1727204099.95429: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204099.95433: Calling groups_plugins_play to load vars for managed-node1
12755 1727204099.97680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204100.00778: done with get_vars()
12755 1727204100.00830: done getting variables
12755 1727204100.00912: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12755 1727204100.01063: variable 'profile' from source: include params
12755 1727204100.01067: variable 'item' from source: include params
12755 1727204100.01152: variable 'item' from source: include params

TASK [Verify the fingerprint comment in ifcfg-bond0] ***************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Tuesday 24 September 2024 14:55:00 -0400 (0:00:00.082) 0:00:25.247 *****
12755 1727204100.01205: entering _queue_task() for managed-node1/set_fact
12755 1727204100.01801: worker is 1 (out of 1 available)
12755 1727204100.01812: exiting _queue_task() for managed-node1/set_fact
12755 1727204100.01828: done queuing things up, now waiting for results queue to drain
12755 1727204100.01829: waiting for pending results...
12755 1727204100.02071: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0
12755 1727204100.02125: in run() - task 12b410aa-8751-72e9-1a19-0000000005ed
12755 1727204100.02168: variable 'ansible_search_path' from source: unknown
12755 1727204100.02171: variable 'ansible_search_path' from source: unknown
12755 1727204100.02207: calling self._execute()
12755 1727204100.02387: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204100.02393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204100.02396: variable 'omit' from source: magic vars
12755 1727204100.02844: variable 'ansible_distribution_major_version' from source: facts
12755 1727204100.02865: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204100.03049: variable 'profile_stat' from source: set_fact
12755 1727204100.03073: Evaluated conditional (profile_stat.stat.exists): False
12755 1727204100.03082: when evaluation is False, skipping this task
12755 1727204100.03093: _execute() done
12755 1727204100.03102: dumping result to json
12755 1727204100.03112: done dumping result, returning
12755 1727204100.03184: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [12b410aa-8751-72e9-1a19-0000000005ed]
12755 1727204100.03188: sending task result for task 12b410aa-8751-72e9-1a19-0000000005ed
12755 1727204100.03596: done sending task result for task 12b410aa-8751-72e9-1a19-0000000005ed
12755 1727204100.03600: WORKER PROCESS EXITING
skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" }
12755 1727204100.03651: no more pending results, returning what we have
12755 1727204100.03656: results queue empty
12755 1727204100.03657: checking for any_errors_fatal
12755 1727204100.03662: done checking for any_errors_fatal
12755 1727204100.03663: checking for max_fail_percentage
12755 1727204100.03665: done checking for max_fail_percentage
12755 1727204100.03666: checking to see if all hosts have failed and the running result is not ok
12755 1727204100.03667: done checking to see if all hosts have failed
12755 1727204100.03668: getting the remaining hosts for this loop
12755 1727204100.03670: done getting the remaining hosts for this loop
12755 1727204100.03673: getting the next task for host managed-node1
12755 1727204100.03682: done getting next task for host managed-node1
12755 1727204100.03686: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
12755 1727204100.03691: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204100.03696: getting variables
12755 1727204100.03698: in VariableManager get_vars()
12755 1727204100.03758: Calling all_inventory to load vars for managed-node1
12755 1727204100.03761: Calling groups_inventory to load vars for managed-node1
12755 1727204100.03765: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204100.03777: Calling all_plugins_play to load vars for managed-node1
12755 1727204100.03781: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204100.03785: Calling groups_plugins_play to load vars for managed-node1
12755 1727204100.13132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204100.19744: done with get_vars()
12755 1727204100.19912: done getting variables
12755 1727204100.19980: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12755 1727204100.20337: variable 'profile' from source: include params
12755 1727204100.20341: variable 'item' from source: include params
12755 1727204100.20496: variable 'item' from source: include params

TASK [Assert that the profile is present - 'bond0'] ****************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Tuesday 24 September 2024 14:55:00 -0400 (0:00:00.193) 0:00:25.441 *****
12755 1727204100.20614: entering _queue_task() for managed-node1/assert
12755 1727204100.21480: worker is 1 (out of 1 available)
12755 1727204100.21502: exiting _queue_task() for managed-node1/assert
12755 1727204100.21593: done queuing things up, now waiting for results queue to drain
12755 1727204100.21596: waiting for pending results...
12755 1727204100.22030: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0'
12755 1727204100.22275: in run() - task 12b410aa-8751-72e9-1a19-000000000356
12755 1727204100.22289: variable 'ansible_search_path' from source: unknown
12755 1727204100.22294: variable 'ansible_search_path' from source: unknown
12755 1727204100.22466: calling self._execute()
12755 1727204100.22794: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204100.22809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204100.22821: variable 'omit' from source: magic vars
12755 1727204100.23740: variable 'ansible_distribution_major_version' from source: facts
12755 1727204100.23756: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204100.23766: variable 'omit' from source: magic vars
12755 1727204100.23821: variable 'omit' from source: magic vars
12755 1727204100.24112: variable 'profile' from source: include params
12755 1727204100.24119: variable 'item' from source: include params
12755 1727204100.24312: variable 'item' from source: include params
12755 1727204100.24336: variable 'omit' from source: magic vars
12755 1727204100.24501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204100.24544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204100.24567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204100.24593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204100.24608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204100.24675: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12755 1727204100.24684: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204100.24687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204100.25086: Set connection var ansible_connection to ssh
12755 1727204100.25096: Set connection var ansible_module_compression to ZIP_DEFLATED
12755 1727204100.25099: Set connection var ansible_shell_type to sh
12755 1727204100.25195: Set connection var ansible_timeout to 10
12755 1727204100.25237: Set connection var ansible_shell_executable to /bin/sh
12755 1727204100.25246: Set connection var ansible_pipelining to False
12755 1727204100.25276: variable 'ansible_shell_executable' from source: unknown
12755 1727204100.25279: variable 'ansible_connection' from source: unknown
12755 1727204100.25282: variable 'ansible_module_compression' from source: unknown
12755 1727204100.25310: variable 'ansible_shell_type' from source: unknown
12755 1727204100.25313: variable 'ansible_shell_executable' from source: unknown
12755 1727204100.25346: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204100.25396: variable 'ansible_pipelining' from source: unknown
12755 1727204100.25401: variable 'ansible_timeout' from source: unknown
12755 1727204100.25404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204100.25648: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12755 1727204100.25778: variable 'omit' from source: magic vars
12755 1727204100.25785: starting attempt loop
12755 1727204100.25788: running the handler
12755 1727204100.26047: variable 'lsr_net_profile_exists' from source: set_fact
12755 1727204100.26071: Evaluated conditional (lsr_net_profile_exists): True
12755 1727204100.26075: handler run complete
12755 1727204100.26078: attempt loop complete, returning result
12755 1727204100.26083: _execute() done
12755 1727204100.26086: dumping result to json
12755 1727204100.26209: done dumping result, returning
12755 1727204100.26212: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0' [12b410aa-8751-72e9-1a19-000000000356]
12755 1727204100.26215: sending task result for task 12b410aa-8751-72e9-1a19-000000000356
12755 1727204100.26375: done sending task result for task 12b410aa-8751-72e9-1a19-000000000356
12755 1727204100.26378: WORKER PROCESS EXITING
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
12755 1727204100.26447: no more pending results, returning what we have
12755 1727204100.26451: results queue empty
12755 1727204100.26452: checking for any_errors_fatal
12755 1727204100.26460: done checking for any_errors_fatal
12755 1727204100.26461: checking for max_fail_percentage
12755 1727204100.26462: done checking for max_fail_percentage
12755 1727204100.26463: checking to see if all hosts have failed and the running result is not ok
12755 1727204100.26464: done checking to see if all hosts have failed
12755 1727204100.26465: getting the remaining hosts for this loop
12755 1727204100.26467: done getting the remaining hosts for this loop
12755 1727204100.26472: getting the next task for host managed-node1
12755 1727204100.26479: done getting next task for host managed-node1
12755 1727204100.26481: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
12755 1727204100.26485: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204100.26492: getting variables
12755 1727204100.26495: in VariableManager get_vars()
12755 1727204100.26567: Calling all_inventory to load vars for managed-node1
12755 1727204100.26571: Calling groups_inventory to load vars for managed-node1
12755 1727204100.26575: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204100.26793: Calling all_plugins_play to load vars for managed-node1
12755 1727204100.26799: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204100.26804: Calling groups_plugins_play to load vars for managed-node1
12755 1727204100.31612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204100.38706: done with get_vars()
12755 1727204100.38910: done getting variables
12755 1727204100.39106: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12755 1727204100.39491: variable 'profile' from source: include params
12755 1727204100.39497: variable 'item' from source: include params
12755 1727204100.39744: variable 'item' from source: include params

TASK [Assert that the ansible managed comment is present in 'bond0'] ***********
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Tuesday 24 September 2024 14:55:00 -0400 (0:00:00.191) 0:00:25.633 *****
12755 1727204100.39793: entering _queue_task() for managed-node1/assert
12755 1727204100.40650: worker is 1 (out of 1 available)
12755 1727204100.40667: exiting _queue_task() for managed-node1/assert
12755 1727204100.40680: done queuing things up, now waiting for results queue to drain
12755 1727204100.40681: waiting for pending results...
12755 1727204100.41474: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0'
12755 1727204100.41894: in run() - task 12b410aa-8751-72e9-1a19-000000000357
12755 1727204100.41900: variable 'ansible_search_path' from source: unknown
12755 1727204100.41904: variable 'ansible_search_path' from source: unknown
12755 1727204100.42001: calling self._execute()
12755 1727204100.42674: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204100.42716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204100.42994: variable 'omit' from source: magic vars
12755 1727204100.43867: variable 'ansible_distribution_major_version' from source: facts
12755 1727204100.44108: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204100.44124: variable 'omit' from source: magic vars
12755 1727204100.44167: variable 'omit' from source: magic vars
12755 1727204100.44293: variable 'profile' from source: include params
12755 1727204100.44474: variable 'item' from source: include params
12755 1727204100.44494: variable 'item' from source: include params
12755 1727204100.44675: variable 'omit' from source: magic vars
12755 1727204100.44824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204100.44867: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204100.44894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204100.44921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204100.44936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204100.45026: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12755 1727204100.45030: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204100.45033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204100.45378: Set connection var ansible_connection to ssh
12755 1727204100.45393: Set connection var ansible_module_compression to ZIP_DEFLATED
12755 1727204100.45397: Set connection var ansible_shell_type to sh
12755 1727204100.45448: Set connection var ansible_timeout to 10
12755 1727204100.45541: Set connection var ansible_shell_executable to /bin/sh
12755 1727204100.45547: Set connection var ansible_pipelining to False
12755 1727204100.45553: variable 'ansible_shell_executable' from source: unknown
12755 1727204100.45556: variable 'ansible_connection' from source: unknown
12755 1727204100.45558: variable 'ansible_module_compression' from source: unknown
12755 1727204100.45561: variable 'ansible_shell_type' from source: unknown
12755 1727204100.45563: variable 'ansible_shell_executable' from source: unknown
12755 1727204100.45565: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204100.45567: variable 'ansible_pipelining' from source: unknown
12755 1727204100.45570: variable 'ansible_timeout' from source: unknown
12755 1727204100.45572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204100.46075: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12755 1727204100.46079: variable 'omit' from source: magic vars
12755 1727204100.46082: starting attempt loop
12755 1727204100.46085: running the handler
12755 1727204100.46262: variable 'lsr_net_profile_ansible_managed' from source: set_fact
12755 1727204100.46269: Evaluated conditional (lsr_net_profile_ansible_managed): True
12755 1727204100.46396: handler run complete
12755 1727204100.46415: attempt loop complete, returning result
12755 1727204100.46421: _execute() done
12755 1727204100.46425: dumping result to json
12755 1727204100.46427: done dumping result, returning
12755 1727204100.46456: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0' [12b410aa-8751-72e9-1a19-000000000357]
12755 1727204100.46460: sending task result for task 12b410aa-8751-72e9-1a19-000000000357
12755 1727204100.46674: done sending task result for task 12b410aa-8751-72e9-1a19-000000000357
12755 1727204100.46678: WORKER PROCESS EXITING
ok: [managed-node1] => { "changed": false }
MSG: All assertions passed
12755 1727204100.46744: no more pending results, returning what we have
12755 1727204100.46749: results queue empty
12755 1727204100.46751: checking for any_errors_fatal
12755 1727204100.46757: done checking for any_errors_fatal
12755 1727204100.46758: checking for max_fail_percentage
12755 1727204100.46761: done checking for max_fail_percentage
12755 1727204100.46762: checking to see if all hosts have failed and the running result is not ok
12755 1727204100.46763: done checking to see if all hosts have failed
12755 1727204100.46764: getting the remaining hosts for this loop
12755 1727204100.46767: done getting the remaining hosts for this loop
12755 1727204100.46772: getting the next task for host managed-node1
12755 1727204100.46837: done getting next task for host managed-node1
12755 1727204100.46840: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
12755 1727204100.46844: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204100.46849: getting variables
12755 1727204100.46851: in VariableManager get_vars()
12755 1727204100.46969: Calling all_inventory to load vars for managed-node1
12755 1727204100.46973: Calling groups_inventory to load vars for managed-node1
12755 1727204100.46976: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204100.47049: Calling all_plugins_play to load vars for managed-node1
12755 1727204100.47055: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204100.47060: Calling groups_plugins_play to load vars for managed-node1
12755 1727204100.52013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204100.56506: done with get_vars()
12755 1727204100.56557: done getting variables
12755 1727204100.56633: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12755 1727204100.56776: variable 'profile' from source: include params
12755 1727204100.56780: variable 'item' from
source: include params 12755 1727204100.56855: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:55:00 -0400 (0:00:00.171) 0:00:25.804 ***** 12755 1727204100.56904: entering _queue_task() for managed-node1/assert 12755 1727204100.57271: worker is 1 (out of 1 available) 12755 1727204100.57287: exiting _queue_task() for managed-node1/assert 12755 1727204100.57425: done queuing things up, now waiting for results queue to drain 12755 1727204100.57427: waiting for pending results... 12755 1727204100.57792: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 12755 1727204100.57803: in run() - task 12b410aa-8751-72e9-1a19-000000000358 12755 1727204100.57825: variable 'ansible_search_path' from source: unknown 12755 1727204100.57829: variable 'ansible_search_path' from source: unknown 12755 1727204100.57878: calling self._execute() 12755 1727204100.57997: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204100.58007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204100.58016: variable 'omit' from source: magic vars 12755 1727204100.58484: variable 'ansible_distribution_major_version' from source: facts 12755 1727204100.58499: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204100.58506: variable 'omit' from source: magic vars 12755 1727204100.58566: variable 'omit' from source: magic vars 12755 1727204100.58695: variable 'profile' from source: include params 12755 1727204100.58699: variable 'item' from source: include params 12755 1727204100.58786: variable 'item' from source: include params 12755 1727204100.58810: variable 'omit' from source: magic vars 12755 1727204100.58868: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204100.58908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204100.58933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204100.58963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204100.58979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204100.59017: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204100.59023: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204100.59029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204100.59164: Set connection var ansible_connection to ssh 12755 1727204100.59180: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204100.59183: Set connection var ansible_shell_type to sh 12755 1727204100.59308: Set connection var ansible_timeout to 10 12755 1727204100.59313: Set connection var ansible_shell_executable to /bin/sh 12755 1727204100.59317: Set connection var ansible_pipelining to False 12755 1727204100.59320: variable 'ansible_shell_executable' from source: unknown 12755 1727204100.59322: variable 'ansible_connection' from source: unknown 12755 1727204100.59325: variable 'ansible_module_compression' from source: unknown 12755 1727204100.59327: variable 'ansible_shell_type' from source: unknown 12755 1727204100.59330: variable 'ansible_shell_executable' from source: unknown 12755 1727204100.59332: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204100.59335: variable 'ansible_pipelining' from source: unknown 12755 1727204100.59338: variable 'ansible_timeout' from 
source: unknown 12755 1727204100.59340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204100.59526: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204100.59530: variable 'omit' from source: magic vars 12755 1727204100.59533: starting attempt loop 12755 1727204100.59536: running the handler 12755 1727204100.59637: variable 'lsr_net_profile_fingerprint' from source: set_fact 12755 1727204100.59641: Evaluated conditional (lsr_net_profile_fingerprint): True 12755 1727204100.59651: handler run complete 12755 1727204100.59673: attempt loop complete, returning result 12755 1727204100.59677: _execute() done 12755 1727204100.59680: dumping result to json 12755 1727204100.59683: done dumping result, returning 12755 1727204100.59704: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 [12b410aa-8751-72e9-1a19-000000000358] 12755 1727204100.59709: sending task result for task 12b410aa-8751-72e9-1a19-000000000358 12755 1727204100.59821: done sending task result for task 12b410aa-8751-72e9-1a19-000000000358 12755 1727204100.59825: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204100.59886: no more pending results, returning what we have 12755 1727204100.59893: results queue empty 12755 1727204100.59894: checking for any_errors_fatal 12755 1727204100.59907: done checking for any_errors_fatal 12755 1727204100.59908: checking for max_fail_percentage 12755 1727204100.59910: done checking for max_fail_percentage 12755 1727204100.59911: checking to see if all hosts have failed and the running result is not ok 12755 1727204100.59913: done checking to see if all hosts have 
failed 12755 1727204100.59913: getting the remaining hosts for this loop 12755 1727204100.59915: done getting the remaining hosts for this loop 12755 1727204100.59922: getting the next task for host managed-node1 12755 1727204100.59934: done getting next task for host managed-node1 12755 1727204100.59937: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12755 1727204100.59941: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204100.59946: getting variables 12755 1727204100.59949: in VariableManager get_vars() 12755 1727204100.60428: Calling all_inventory to load vars for managed-node1 12755 1727204100.60432: Calling groups_inventory to load vars for managed-node1 12755 1727204100.60436: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204100.60447: Calling all_plugins_play to load vars for managed-node1 12755 1727204100.60451: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204100.60455: Calling groups_plugins_play to load vars for managed-node1 12755 1727204100.65602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204100.72114: done with get_vars() 12755 1727204100.72165: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Tuesday 24 September 2024 14:55:00 -0400 (0:00:00.153) 0:00:25.958 ***** 12755 1727204100.72287: entering _queue_task() for managed-node1/include_tasks 12755 1727204100.73377: worker is 1 (out of 1 available) 12755 1727204100.73395: exiting _queue_task() for managed-node1/include_tasks 12755 1727204100.73408: done queuing things up, now waiting for results queue to drain 12755 1727204100.73410: waiting for pending results... 12755 1727204100.73941: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 12755 1727204100.74256: in run() - task 12b410aa-8751-72e9-1a19-00000000035c 12755 1727204100.74260: variable 'ansible_search_path' from source: unknown 12755 1727204100.74263: variable 'ansible_search_path' from source: unknown 12755 1727204100.74267: calling self._execute() 12755 1727204100.74498: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204100.74515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204100.74598: variable 'omit' from source: magic vars 12755 1727204100.75546: variable 'ansible_distribution_major_version' from source: facts 12755 1727204100.75574: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204100.75626: _execute() done 12755 1727204100.75638: dumping result to json 12755 1727204100.75647: done dumping result, returning 12755 1727204100.75660: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-72e9-1a19-00000000035c] 12755 1727204100.75833: sending task result for task 12b410aa-8751-72e9-1a19-00000000035c 12755 1727204100.76036: done sending task result for task 12b410aa-8751-72e9-1a19-00000000035c 12755 1727204100.76041: WORKER PROCESS EXITING 12755 1727204100.76073: no more pending results, returning what we have 12755 1727204100.76078: in VariableManager get_vars() 12755 1727204100.76267: Calling all_inventory to load vars for managed-node1 12755 
1727204100.76271: Calling groups_inventory to load vars for managed-node1 12755 1727204100.76274: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204100.76291: Calling all_plugins_play to load vars for managed-node1 12755 1727204100.76295: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204100.76299: Calling groups_plugins_play to load vars for managed-node1 12755 1727204100.80113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204100.84202: done with get_vars() 12755 1727204100.84238: variable 'ansible_search_path' from source: unknown 12755 1727204100.84240: variable 'ansible_search_path' from source: unknown 12755 1727204100.84288: we have included files to process 12755 1727204100.84293: generating all_blocks data 12755 1727204100.84295: done generating all_blocks data 12755 1727204100.84307: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12755 1727204100.84308: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12755 1727204100.84312: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12755 1727204100.85570: done processing included file 12755 1727204100.85573: iterating over new_blocks loaded from include file 12755 1727204100.85575: in VariableManager get_vars() 12755 1727204100.85625: done with get_vars() 12755 1727204100.85628: filtering new block on tags 12755 1727204100.85665: done filtering new block on tags 12755 1727204100.85669: in VariableManager get_vars() 12755 1727204100.85709: done with get_vars() 12755 1727204100.85711: filtering new block on tags 12755 1727204100.85752: done filtering new block on tags 12755 1727204100.85756: done iterating over new_blocks 
loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 12755 1727204100.85762: extending task lists for all hosts with included blocks 12755 1727204100.86040: done extending task lists 12755 1727204100.86042: done processing included files 12755 1727204100.86048: results queue empty 12755 1727204100.86049: checking for any_errors_fatal 12755 1727204100.86054: done checking for any_errors_fatal 12755 1727204100.86059: checking for max_fail_percentage 12755 1727204100.86061: done checking for max_fail_percentage 12755 1727204100.86062: checking to see if all hosts have failed and the running result is not ok 12755 1727204100.86063: done checking to see if all hosts have failed 12755 1727204100.86064: getting the remaining hosts for this loop 12755 1727204100.86066: done getting the remaining hosts for this loop 12755 1727204100.86069: getting the next task for host managed-node1 12755 1727204100.86074: done getting next task for host managed-node1 12755 1727204100.86077: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12755 1727204100.86080: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204100.86083: getting variables 12755 1727204100.86084: in VariableManager get_vars() 12755 1727204100.86109: Calling all_inventory to load vars for managed-node1 12755 1727204100.86112: Calling groups_inventory to load vars for managed-node1 12755 1727204100.86116: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204100.86122: Calling all_plugins_play to load vars for managed-node1 12755 1727204100.86126: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204100.86129: Calling groups_plugins_play to load vars for managed-node1 12755 1727204100.88818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204100.90921: done with get_vars() 12755 1727204100.90975: done getting variables 12755 1727204100.91036: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:55:00 -0400 (0:00:00.187) 0:00:26.146 ***** 12755 1727204100.91073: entering _queue_task() for managed-node1/set_fact 12755 1727204100.91462: worker is 1 (out of 1 available) 12755 1727204100.91477: exiting _queue_task() for managed-node1/set_fact 12755 1727204100.91597: done queuing things up, now waiting for results queue to drain 12755 1727204100.91599: waiting for pending results... 
12755 1727204100.91929: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 12755 1727204100.92101: in run() - task 12b410aa-8751-72e9-1a19-00000000062c 12755 1727204100.92106: variable 'ansible_search_path' from source: unknown 12755 1727204100.92109: variable 'ansible_search_path' from source: unknown 12755 1727204100.92113: calling self._execute() 12755 1727204100.92225: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204100.92244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204100.92259: variable 'omit' from source: magic vars 12755 1727204100.92743: variable 'ansible_distribution_major_version' from source: facts 12755 1727204100.92765: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204100.92783: variable 'omit' from source: magic vars 12755 1727204100.92892: variable 'omit' from source: magic vars 12755 1727204100.92907: variable 'omit' from source: magic vars 12755 1727204100.92968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204100.93022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204100.93052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204100.93080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204100.93197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204100.93201: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204100.93204: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204100.93207: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12755 1727204100.93308: Set connection var ansible_connection to ssh 12755 1727204100.93329: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204100.93337: Set connection var ansible_shell_type to sh 12755 1727204100.93357: Set connection var ansible_timeout to 10 12755 1727204100.93369: Set connection var ansible_shell_executable to /bin/sh 12755 1727204100.93380: Set connection var ansible_pipelining to False 12755 1727204100.93412: variable 'ansible_shell_executable' from source: unknown 12755 1727204100.93428: variable 'ansible_connection' from source: unknown 12755 1727204100.93447: variable 'ansible_module_compression' from source: unknown 12755 1727204100.93595: variable 'ansible_shell_type' from source: unknown 12755 1727204100.93599: variable 'ansible_shell_executable' from source: unknown 12755 1727204100.93602: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204100.93604: variable 'ansible_pipelining' from source: unknown 12755 1727204100.93606: variable 'ansible_timeout' from source: unknown 12755 1727204100.93608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204100.93672: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204100.93703: variable 'omit' from source: magic vars 12755 1727204100.93714: starting attempt loop 12755 1727204100.93729: running the handler 12755 1727204100.93750: handler run complete 12755 1727204100.93766: attempt loop complete, returning result 12755 1727204100.93774: _execute() done 12755 1727204100.93782: dumping result to json 12755 1727204100.93793: done dumping result, returning 12755 1727204100.93806: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-72e9-1a19-00000000062c] 12755 1727204100.93816: sending task result for task 12b410aa-8751-72e9-1a19-00000000062c ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12755 1727204100.94006: no more pending results, returning what we have 12755 1727204100.94009: results queue empty 12755 1727204100.94011: checking for any_errors_fatal 12755 1727204100.94013: done checking for any_errors_fatal 12755 1727204100.94014: checking for max_fail_percentage 12755 1727204100.94016: done checking for max_fail_percentage 12755 1727204100.94019: checking to see if all hosts have failed and the running result is not ok 12755 1727204100.94020: done checking to see if all hosts have failed 12755 1727204100.94021: getting the remaining hosts for this loop 12755 1727204100.94023: done getting the remaining hosts for this loop 12755 1727204100.94029: getting the next task for host managed-node1 12755 1727204100.94037: done getting next task for host managed-node1 12755 1727204100.94040: ^ task is: TASK: Stat profile file 12755 1727204100.94045: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204100.94050: getting variables 12755 1727204100.94053: in VariableManager get_vars() 12755 1727204100.94200: Calling all_inventory to load vars for managed-node1 12755 1727204100.94205: Calling groups_inventory to load vars for managed-node1 12755 1727204100.94208: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204100.94222: Calling all_plugins_play to load vars for managed-node1 12755 1727204100.94225: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204100.94229: Calling groups_plugins_play to load vars for managed-node1 12755 1727204100.94790: done sending task result for task 12b410aa-8751-72e9-1a19-00000000062c 12755 1727204100.94793: WORKER PROCESS EXITING 12755 1727204100.96337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204100.97945: done with get_vars() 12755 1727204100.97975: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:55:00 -0400 (0:00:00.069) 0:00:26.216 ***** 12755 1727204100.98063: entering _queue_task() for managed-node1/stat 12755 1727204100.98346: worker is 1 (out of 1 available) 12755 1727204100.98364: exiting _queue_task() for managed-node1/stat 12755 1727204100.98379: done queuing things up, now waiting for results queue to drain 12755 1727204100.98380: waiting for pending results... 
12755 1727204100.98593: running TaskExecutor() for managed-node1/TASK: Stat profile file 12755 1727204100.98808: in run() - task 12b410aa-8751-72e9-1a19-00000000062d 12755 1727204100.98812: variable 'ansible_search_path' from source: unknown 12755 1727204100.98816: variable 'ansible_search_path' from source: unknown 12755 1727204100.98819: calling self._execute() 12755 1727204100.98956: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204100.98960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204100.98964: variable 'omit' from source: magic vars 12755 1727204100.99855: variable 'ansible_distribution_major_version' from source: facts 12755 1727204100.99859: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204100.99862: variable 'omit' from source: magic vars 12755 1727204100.99864: variable 'omit' from source: magic vars 12755 1727204101.00096: variable 'profile' from source: include params 12755 1727204101.00109: variable 'item' from source: include params 12755 1727204101.00290: variable 'item' from source: include params 12755 1727204101.00296: variable 'omit' from source: magic vars 12755 1727204101.00328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204101.00377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204101.00423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204101.00508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204101.00511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204101.00520: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 
1727204101.00535: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204101.00543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204101.00697: Set connection var ansible_connection to ssh 12755 1727204101.00713: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204101.00728: Set connection var ansible_shell_type to sh 12755 1727204101.00760: Set connection var ansible_timeout to 10 12755 1727204101.00773: Set connection var ansible_shell_executable to /bin/sh 12755 1727204101.00787: Set connection var ansible_pipelining to False 12755 1727204101.00829: variable 'ansible_shell_executable' from source: unknown 12755 1727204101.00861: variable 'ansible_connection' from source: unknown 12755 1727204101.00866: variable 'ansible_module_compression' from source: unknown 12755 1727204101.00870: variable 'ansible_shell_type' from source: unknown 12755 1727204101.00873: variable 'ansible_shell_executable' from source: unknown 12755 1727204101.00876: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204101.00884: variable 'ansible_pipelining' from source: unknown 12755 1727204101.00887: variable 'ansible_timeout' from source: unknown 12755 1727204101.00895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204101.01093: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204101.01106: variable 'omit' from source: magic vars 12755 1727204101.01112: starting attempt loop 12755 1727204101.01115: running the handler 12755 1727204101.01142: _low_level_execute_command(): starting 12755 1727204101.01146: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204101.01678: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.01688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204101.01696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.01727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204101.01731: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204101.01733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.01787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.01799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.01868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.03761: stdout chunk (state=3): >>>/root <<< 12755 1727204101.03913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.04031: stderr chunk (state=3): >>><<< 12755 1727204101.04035: stdout chunk (state=3): >>><<< 12755 1727204101.04052: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204101.04075: _low_level_execute_command(): starting 12755 1727204101.04090: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985 `" && echo ansible-tmp-1727204101.0406063-14484-137951602450985="` echo /root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985 `" ) && sleep 0' 12755 1727204101.04846: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204101.04900: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204101.04968: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.05042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.05088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204101.05132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.05329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.07385: stdout chunk (state=3): >>>ansible-tmp-1727204101.0406063-14484-137951602450985=/root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985 <<< 12755 1727204101.07509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.07610: stderr chunk (state=3): >>><<< 12755 1727204101.07613: stdout chunk (state=3): >>><<< 12755 1727204101.07649: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204101.0406063-14484-137951602450985=/root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204101.07722: variable 'ansible_module_compression' from source: unknown 12755 1727204101.07777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12755 1727204101.07812: variable 'ansible_facts' from source: unknown 12755 1727204101.07866: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/AnsiballZ_stat.py 12755 1727204101.08032: Sending initial data 12755 1727204101.08036: Sent initial data (153 bytes) 12755 1727204101.08603: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204101.08606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204101.08609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204101.08614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.08670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.08673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.08765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.10510: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204101.10597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204101.10637: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpigy_jc6r /root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/AnsiballZ_stat.py <<< 12755 1727204101.10645: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/AnsiballZ_stat.py" <<< 12755 1727204101.10677: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpigy_jc6r" to remote "/root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/AnsiballZ_stat.py" <<< 12755 1727204101.11476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.11541: stderr chunk (state=3): >>><<< 12755 1727204101.11545: stdout chunk (state=3): >>><<< 12755 1727204101.11565: done transferring module to remote 12755 1727204101.11575: _low_level_execute_command(): starting 12755 1727204101.11581: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/ /root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/AnsiballZ_stat.py && sleep 0' 12755 1727204101.12043: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.12046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204101.12050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.12052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.12107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.12114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.12156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.14107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.14161: stderr chunk (state=3): >>><<< 12755 1727204101.14165: stdout chunk (state=3): >>><<< 12755 1727204101.14181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204101.14186: _low_level_execute_command(): starting 12755 1727204101.14191: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/AnsiballZ_stat.py && sleep 0' 12755 1727204101.14677: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.14681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.14683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.14686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.14745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.14749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204101.14752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12755 1727204101.14860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.33237: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12755 1727204101.34733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204101.34803: stderr chunk (state=3): >>><<< 12755 1727204101.34807: stdout chunk (state=3): >>><<< 12755 1727204101.34825: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204101.34854: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204101.34868: _low_level_execute_command(): starting 12755 1727204101.34875: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204101.0406063-14484-137951602450985/ > /dev/null 2>&1 && sleep 0' 12755 1727204101.35396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204101.35400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.35402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration 
<<< 12755 1727204101.35405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204101.35408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.35465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.35472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.35519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.37510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.37566: stderr chunk (state=3): >>><<< 12755 1727204101.37569: stdout chunk (state=3): >>><<< 12755 1727204101.37584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204101.37595: handler run complete 12755 1727204101.37624: attempt loop complete, returning result 12755 1727204101.37628: _execute() done 12755 1727204101.37631: dumping result to json 12755 1727204101.37639: done dumping result, returning 12755 1727204101.37647: done running TaskExecutor() for managed-node1/TASK: Stat profile file [12b410aa-8751-72e9-1a19-00000000062d] 12755 1727204101.37652: sending task result for task 12b410aa-8751-72e9-1a19-00000000062d 12755 1727204101.37761: done sending task result for task 12b410aa-8751-72e9-1a19-00000000062d 12755 1727204101.37765: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 12755 1727204101.37844: no more pending results, returning what we have 12755 1727204101.37847: results queue empty 12755 1727204101.37848: checking for any_errors_fatal 12755 1727204101.37858: done checking for any_errors_fatal 12755 1727204101.37859: checking for max_fail_percentage 12755 1727204101.37861: done checking for max_fail_percentage 12755 1727204101.37862: checking to see if all hosts have failed and the running result is not ok 12755 1727204101.37863: done checking to see if all hosts have failed 12755 1727204101.37864: getting the remaining hosts for this loop 12755 1727204101.37865: done getting the remaining hosts for this loop 12755 1727204101.37870: getting the next task for host managed-node1 12755 1727204101.37897: done getting next task for host managed-node1 12755 1727204101.37900: ^ task is: TASK: Set NM profile exist flag based on the profile files 12755 1727204101.37905: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204101.37909: getting variables 12755 1727204101.37911: in VariableManager get_vars() 12755 1727204101.37981: Calling all_inventory to load vars for managed-node1 12755 1727204101.37985: Calling groups_inventory to load vars for managed-node1 12755 1727204101.37988: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204101.38011: Calling all_plugins_play to load vars for managed-node1 12755 1727204101.38015: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204101.38019: Calling groups_plugins_play to load vars for managed-node1 12755 1727204101.39553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204101.41502: done with get_vars() 12755 1727204101.41537: done getting variables 12755 1727204101.41609: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:55:01 -0400 (0:00:00.435) 0:00:26.652 ***** 12755 1727204101.41638: entering _queue_task() for managed-node1/set_fact 12755 1727204101.41944: worker is 1 (out of 1 available) 12755 1727204101.41959: exiting _queue_task() for managed-node1/set_fact 12755 1727204101.41974: done queuing things up, now waiting for results queue to drain 12755 1727204101.41976: waiting for pending results... 12755 1727204101.42190: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 12755 1727204101.42282: in run() - task 12b410aa-8751-72e9-1a19-00000000062e 12755 1727204101.42314: variable 'ansible_search_path' from source: unknown 12755 1727204101.42321: variable 'ansible_search_path' from source: unknown 12755 1727204101.42352: calling self._execute() 12755 1727204101.42452: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204101.42460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204101.42467: variable 'omit' from source: magic vars 12755 1727204101.42813: variable 'ansible_distribution_major_version' from source: facts 12755 1727204101.42828: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204101.42939: variable 'profile_stat' from source: set_fact 12755 1727204101.42951: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204101.42954: when evaluation is False, skipping this task 12755 1727204101.42957: _execute() done 12755 1727204101.42962: dumping result to json 12755 1727204101.42965: done dumping result, returning 12755 1727204101.42973: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-72e9-1a19-00000000062e] 12755 1727204101.42981: sending task result for task 
12b410aa-8751-72e9-1a19-00000000062e 12755 1727204101.43074: done sending task result for task 12b410aa-8751-72e9-1a19-00000000062e 12755 1727204101.43078: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204101.43139: no more pending results, returning what we have 12755 1727204101.43143: results queue empty 12755 1727204101.43145: checking for any_errors_fatal 12755 1727204101.43155: done checking for any_errors_fatal 12755 1727204101.43156: checking for max_fail_percentage 12755 1727204101.43157: done checking for max_fail_percentage 12755 1727204101.43158: checking to see if all hosts have failed and the running result is not ok 12755 1727204101.43159: done checking to see if all hosts have failed 12755 1727204101.43160: getting the remaining hosts for this loop 12755 1727204101.43162: done getting the remaining hosts for this loop 12755 1727204101.43167: getting the next task for host managed-node1 12755 1727204101.43176: done getting next task for host managed-node1 12755 1727204101.43179: ^ task is: TASK: Get NM profile info 12755 1727204101.43183: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204101.43187: getting variables 12755 1727204101.43188: in VariableManager get_vars() 12755 1727204101.43246: Calling all_inventory to load vars for managed-node1 12755 1727204101.43249: Calling groups_inventory to load vars for managed-node1 12755 1727204101.43252: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204101.43265: Calling all_plugins_play to load vars for managed-node1 12755 1727204101.43268: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204101.43271: Calling groups_plugins_play to load vars for managed-node1 12755 1727204101.44610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204101.46754: done with get_vars() 12755 1727204101.46806: done getting variables 12755 1727204101.46890: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:55:01 -0400 (0:00:00.052) 0:00:26.705 ***** 12755 1727204101.46936: entering _queue_task() for managed-node1/shell 12755 1727204101.47347: worker is 1 (out of 1 available) 12755 1727204101.47362: exiting _queue_task() for managed-node1/shell 12755 1727204101.47382: done queuing things up, now waiting for results queue to drain 12755 1727204101.47384: waiting for pending results... 
12755 1727204101.47937: running TaskExecutor() for managed-node1/TASK: Get NM profile info 12755 1727204101.48573: in run() - task 12b410aa-8751-72e9-1a19-00000000062f 12755 1727204101.48585: variable 'ansible_search_path' from source: unknown 12755 1727204101.48590: variable 'ansible_search_path' from source: unknown 12755 1727204101.48594: calling self._execute() 12755 1727204101.48623: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204101.48627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204101.48631: variable 'omit' from source: magic vars 12755 1727204101.49678: variable 'ansible_distribution_major_version' from source: facts 12755 1727204101.49739: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204101.49758: variable 'omit' from source: magic vars 12755 1727204101.49813: variable 'omit' from source: magic vars 12755 1727204101.50145: variable 'profile' from source: include params 12755 1727204101.50149: variable 'item' from source: include params 12755 1727204101.50239: variable 'item' from source: include params 12755 1727204101.50308: variable 'omit' from source: magic vars 12755 1727204101.50314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204101.50379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204101.50396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204101.50494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204101.50498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204101.50513: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 
1727204101.50520: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204101.50524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204101.50666: Set connection var ansible_connection to ssh 12755 1727204101.50675: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204101.50678: Set connection var ansible_shell_type to sh 12755 1727204101.50733: Set connection var ansible_timeout to 10 12755 1727204101.50736: Set connection var ansible_shell_executable to /bin/sh 12755 1727204101.50739: Set connection var ansible_pipelining to False 12755 1727204101.50755: variable 'ansible_shell_executable' from source: unknown 12755 1727204101.50796: variable 'ansible_connection' from source: unknown 12755 1727204101.50799: variable 'ansible_module_compression' from source: unknown 12755 1727204101.50802: variable 'ansible_shell_type' from source: unknown 12755 1727204101.50804: variable 'ansible_shell_executable' from source: unknown 12755 1727204101.50806: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204101.50809: variable 'ansible_pipelining' from source: unknown 12755 1727204101.50823: variable 'ansible_timeout' from source: unknown 12755 1727204101.50825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204101.51046: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204101.51050: variable 'omit' from source: magic vars 12755 1727204101.51053: starting attempt loop 12755 1727204101.51055: running the handler 12755 1727204101.51058: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204101.51084: _low_level_execute_command(): starting 12755 1727204101.51088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204101.52144: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.52148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.52151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204101.52173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.52266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.54078: stdout chunk (state=3): >>>/root <<< 12755 1727204101.54190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.54248: stderr chunk (state=3): >>><<< 12755 1727204101.54251: stdout chunk (state=3): >>><<< 12755 
1727204101.54275: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204101.54292: _low_level_execute_command(): starting 12755 1727204101.54299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098 `" && echo ansible-tmp-1727204101.5427608-14500-156824913661098="` echo /root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098 `" ) && sleep 0' 12755 1727204101.54761: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.54772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.54775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.54777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.54825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.54828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.54884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.57045: stdout chunk (state=3): >>>ansible-tmp-1727204101.5427608-14500-156824913661098=/root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098 <<< 12755 1727204101.57340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.57344: stdout chunk (state=3): >>><<< 12755 1727204101.57346: stderr chunk (state=3): >>><<< 12755 1727204101.57495: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204101.5427608-14500-156824913661098=/root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204101.57499: variable 'ansible_module_compression' from source: unknown 12755 1727204101.57501: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204101.57534: variable 'ansible_facts' from source: unknown 12755 1727204101.57636: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/AnsiballZ_command.py 12755 1727204101.57867: Sending initial data 12755 1727204101.57871: Sent initial data (156 bytes) 12755 1727204101.58622: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.58700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.58728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204101.58763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.58845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.60625: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204101.60657: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204101.60727: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmp3wfm9zcr /root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/AnsiballZ_command.py <<< 12755 1727204101.60732: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/AnsiballZ_command.py" <<< 12755 1727204101.60767: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmp3wfm9zcr" to remote "/root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/AnsiballZ_command.py" <<< 12755 1727204101.61986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.62093: stderr chunk (state=3): >>><<< 12755 1727204101.62105: stdout chunk (state=3): >>><<< 12755 1727204101.62108: done transferring module to remote 12755 1727204101.62110: _low_level_execute_command(): starting 12755 1727204101.62113: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/ /root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/AnsiballZ_command.py && sleep 0' 12755 1727204101.62827: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204101.62844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204101.62875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.62983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.63024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.63042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204101.63063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.63203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.65257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.65261: stdout chunk (state=3): >>><<< 12755 1727204101.65264: stderr chunk (state=3): >>><<< 12755 1727204101.65283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204101.65294: _low_level_execute_command(): starting 12755 1727204101.65305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/AnsiballZ_command.py && sleep 0' 12755 1727204101.66051: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204101.66188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204101.66213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204101.66250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.66349: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 12755 1727204101.87022: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:55:01.845562", "end": "2024-09-24 14:55:01.869317", "delta": "0:00:00.023755", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204101.88859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204101.88927: stderr chunk (state=3): >>><<< 12755 1727204101.88931: stdout chunk (state=3): >>><<< 12755 1727204101.88951: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:55:01.845562", "end": "2024-09-24 14:55:01.869317", "delta": "0:00:00.023755", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204101.88990: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204101.88999: _low_level_execute_command(): starting 12755 1727204101.89008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204101.5427608-14500-156824913661098/ > /dev/null 2>&1 && sleep 0' 12755 1727204101.89486: 
stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.89493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204101.89527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.89530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204101.89538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204101.89604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204101.89607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204101.89612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204101.89664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204101.91623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204101.91672: stderr chunk (state=3): >>><<< 12755 1727204101.91676: stdout chunk (state=3): >>><<< 12755 1727204101.91693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12755 1727204101.91701: handler run complete
12755 1727204101.91731: Evaluated conditional (False): False
12755 1727204101.91743: attempt loop complete, returning result
12755 1727204101.91746: _execute() done
12755 1727204101.91749: dumping result to json
12755 1727204101.91756: done dumping result, returning
12755 1727204101.91764: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [12b410aa-8751-72e9-1a19-00000000062f]
12755 1727204101.91770: sending task result for task 12b410aa-8751-72e9-1a19-00000000062f
12755 1727204101.91877: done sending task result for task 12b410aa-8751-72e9-1a19-00000000062f
12755 1727204101.91879: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc",
    "delta": "0:00:00.023755",
    "end": "2024-09-24 14:55:01.869317",
    "rc": 0,
    "start": "2024-09-24 14:55:01.845562"
}

STDOUT:

bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection
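The module result the remote Python printed is plain JSON; the `rc` field is what the follow-up conditional (`nm_profile_exists.rc == 0`) keys off, and the reported `delta` is just `end - start`. A minimal sketch of that, using the values copied from the log (the parsing code is illustrative, not Ansible's actual callback logic):

```python
import json
from datetime import datetime

# Module result as AnsiballZ_command.py printed it, trimmed to the fields used here
# (values copied verbatim from the log above).
raw = ('{"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection", '
       '"stderr": "", "rc": 0, "start": "2024-09-24 14:55:01.845562", "end": "2024-09-24 14:55:01.869317"}')
result = json.loads(raw)

# rc == 0 means the grep pipeline found the profile file under /etc,
# which is what the later set_fact task tests.
profile_found = result["rc"] == 0

# The reported "delta" is simply end - start in the module's timestamp format.
fmt = "%Y-%m-%d %H:%M:%S.%f"
delta = datetime.strptime(result["end"], fmt) - datetime.strptime(result["start"], fmt)
print(profile_found, delta)  # True 0:00:00.023755
```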
12755 1727204101.91976: no more pending results, returning what we have 12755 1727204101.91979: results queue empty 12755 1727204101.91981: checking for any_errors_fatal 12755 1727204101.91987: done checking for any_errors_fatal 12755 1727204101.91988: checking for max_fail_percentage 12755 1727204101.92000: done checking for max_fail_percentage 12755 1727204101.92001: checking to see if all hosts have failed and the running result is not ok 12755 1727204101.92002: done checking to see if all hosts have failed 12755 1727204101.92003: getting the remaining hosts for this loop 12755 1727204101.92005: done getting the remaining hosts for this loop 12755 1727204101.92009: getting the next task for host managed-node1 12755 1727204101.92020: done getting next task for host managed-node1 12755 1727204101.92023: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12755 1727204101.92028: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
12755 1727204101.92031: getting variables
12755 1727204101.92033: in VariableManager get_vars()
12755 1727204101.92083: Calling all_inventory to load vars for managed-node1
12755 1727204101.92087: Calling groups_inventory to load vars for managed-node1
12755 1727204101.92092: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204101.92113: Calling all_plugins_play to load vars for managed-node1
12755 1727204101.92119: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204101.92124: Calling groups_plugins_play to load vars for managed-node1
12755 1727204101.93559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204101.96245: done with get_vars()
12755 1727204101.96297: done getting variables
12755 1727204101.96387: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Tuesday 24 September 2024 14:55:01 -0400 (0:00:00.494) 0:00:27.200 *****
12755 1727204101.96430: entering _queue_task() for managed-node1/set_fact
12755 1727204101.96859: worker is 1 (out of 1 available)
12755 1727204101.96875: exiting _queue_task() for managed-node1/set_fact
12755 1727204101.96893: done queuing things up, now waiting for results queue to drain
12755 1727204101.96895: waiting for pending results...
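Each `TASK [...]` banner in this output is followed by a timing line: the parenthesized value is the time spent on the previous task, the trailing value is the cumulative playbook runtime. A small sketch pulling both out of the timing line shown above (the regex is mine, not from Ansible's timing callback):

```python
import re

# Timing line exactly as it appears after the TASK banner above.
line = "Tuesday 24 September 2024 14:55:01 -0400 (0:00:00.494) 0:00:27.200 *****"

# First parenthesized duration = previous task's elapsed time,
# trailing duration = cumulative runtime so far.
m = re.search(r"\((\d+:\d+:\d+\.\d+)\)\s+(\d+:\d+:\d+\.\d+)", line)
prev_task, cumulative = m.groups()
print(prev_task, cumulative)  # 0:00:00.494 0:00:27.200
```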
12755 1727204101.97218: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12755 1727204101.97343: in run() - task 12b410aa-8751-72e9-1a19-000000000630 12755 1727204101.97348: variable 'ansible_search_path' from source: unknown 12755 1727204101.97351: variable 'ansible_search_path' from source: unknown 12755 1727204101.97448: calling self._execute() 12755 1727204101.97526: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204101.97533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204101.97578: variable 'omit' from source: magic vars 12755 1727204101.97988: variable 'ansible_distribution_major_version' from source: facts 12755 1727204101.98002: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204101.98186: variable 'nm_profile_exists' from source: set_fact 12755 1727204101.98203: Evaluated conditional (nm_profile_exists.rc == 0): True 12755 1727204101.98209: variable 'omit' from source: magic vars 12755 1727204101.98254: variable 'omit' from source: magic vars 12755 1727204101.98283: variable 'omit' from source: magic vars 12755 1727204101.98327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204101.98361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204101.98380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204101.98401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204101.98423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204101.98450: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
12755 1727204101.98454: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204101.98457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204101.98545: Set connection var ansible_connection to ssh 12755 1727204101.98551: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204101.98554: Set connection var ansible_shell_type to sh 12755 1727204101.98567: Set connection var ansible_timeout to 10 12755 1727204101.98572: Set connection var ansible_shell_executable to /bin/sh 12755 1727204101.98579: Set connection var ansible_pipelining to False 12755 1727204101.98602: variable 'ansible_shell_executable' from source: unknown 12755 1727204101.98605: variable 'ansible_connection' from source: unknown 12755 1727204101.98608: variable 'ansible_module_compression' from source: unknown 12755 1727204101.98612: variable 'ansible_shell_type' from source: unknown 12755 1727204101.98615: variable 'ansible_shell_executable' from source: unknown 12755 1727204101.98620: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204101.98624: variable 'ansible_pipelining' from source: unknown 12755 1727204101.98626: variable 'ansible_timeout' from source: unknown 12755 1727204101.98637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204101.98858: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204101.98862: variable 'omit' from source: magic vars 12755 1727204101.98865: starting attempt loop 12755 1727204101.98867: running the handler 12755 1727204101.98870: handler run complete 12755 1727204101.98872: attempt loop complete, returning result 12755 1727204101.98874: _execute() done 
12755 1727204101.98876: dumping result to json
12755 1727204101.98879: done dumping result, returning
12755 1727204101.98881: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-72e9-1a19-000000000630]
12755 1727204101.98884: sending task result for task 12b410aa-8751-72e9-1a19-000000000630
12755 1727204101.98955: done sending task result for task 12b410aa-8751-72e9-1a19-000000000630
12755 1727204101.98959: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
12755 1727204101.99047: no more pending results, returning what we have
12755 1727204101.99051: results queue empty
12755 1727204101.99052: checking for any_errors_fatal
12755 1727204101.99062: done checking for any_errors_fatal
12755 1727204101.99063: checking for max_fail_percentage
12755 1727204101.99065: done checking for max_fail_percentage
12755 1727204101.99065: checking to see if all hosts have failed and the running result is not ok
12755 1727204101.99066: done checking to see if all hosts have failed
12755 1727204101.99067: getting the remaining hosts for this loop
12755 1727204101.99069: done getting the remaining hosts for this loop
12755 1727204101.99073: getting the next task for host managed-node1
12755 1727204101.99083: done getting next task for host managed-node1
12755 1727204101.99085: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
12755 1727204101.99218: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204101.99224: getting variables 12755 1727204101.99226: in VariableManager get_vars() 12755 1727204101.99286: Calling all_inventory to load vars for managed-node1 12755 1727204101.99296: Calling groups_inventory to load vars for managed-node1 12755 1727204101.99299: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204101.99311: Calling all_plugins_play to load vars for managed-node1 12755 1727204101.99315: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204101.99321: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.01842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.04512: done with get_vars() 12755 1727204102.04551: done getting variables 12755 1727204102.04606: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204102.04711: variable 'profile' from source: include params 12755 1727204102.04715: variable 'item' from source: include params 12755 1727204102.04769: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.083) 0:00:27.283 ***** 12755 1727204102.04805: entering _queue_task() for managed-node1/command 12755 1727204102.05080: worker is 1 (out of 1 available) 12755 1727204102.05098: exiting _queue_task() for managed-node1/command 12755 1727204102.05113: done queuing things up, now waiting for results queue to drain 12755 1727204102.05115: waiting for pending results... 12755 1727204102.05476: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 12755 1727204102.05481: in run() - task 12b410aa-8751-72e9-1a19-000000000632 12755 1727204102.05484: variable 'ansible_search_path' from source: unknown 12755 1727204102.05488: variable 'ansible_search_path' from source: unknown 12755 1727204102.05519: calling self._execute() 12755 1727204102.05623: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.05627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.05641: variable 'omit' from source: magic vars 12755 1727204102.06125: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.06135: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.06373: variable 'profile_stat' from source: set_fact 12755 1727204102.06376: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204102.06379: when evaluation is False, skipping this task 12755 1727204102.06382: _execute() done 12755 1727204102.06384: dumping result to json 12755 1727204102.06387: done dumping result, returning 12755 1727204102.06391: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [12b410aa-8751-72e9-1a19-000000000632] 12755 1727204102.06394: sending task result for task 12b410aa-8751-72e9-1a19-000000000632 12755 
1727204102.06589: done sending task result for task 12b410aa-8751-72e9-1a19-000000000632 12755 1727204102.06595: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204102.06663: no more pending results, returning what we have 12755 1727204102.06669: results queue empty 12755 1727204102.06670: checking for any_errors_fatal 12755 1727204102.06675: done checking for any_errors_fatal 12755 1727204102.06676: checking for max_fail_percentage 12755 1727204102.06678: done checking for max_fail_percentage 12755 1727204102.06679: checking to see if all hosts have failed and the running result is not ok 12755 1727204102.06680: done checking to see if all hosts have failed 12755 1727204102.06681: getting the remaining hosts for this loop 12755 1727204102.06683: done getting the remaining hosts for this loop 12755 1727204102.06688: getting the next task for host managed-node1 12755 1727204102.06697: done getting next task for host managed-node1 12755 1727204102.06700: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12755 1727204102.06704: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204102.06709: getting variables 12755 1727204102.06711: in VariableManager get_vars() 12755 1727204102.07008: Calling all_inventory to load vars for managed-node1 12755 1727204102.07012: Calling groups_inventory to load vars for managed-node1 12755 1727204102.07015: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.07030: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.07033: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.07036: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.10318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.13397: done with get_vars() 12755 1727204102.13452: done getting variables 12755 1727204102.13528: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204102.13794: variable 'profile' from source: include params 12755 1727204102.13800: variable 'item' from source: include params 12755 1727204102.13911: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.091) 0:00:27.375 ***** 12755 1727204102.13977: entering _queue_task() for managed-node1/set_fact 12755 1727204102.14727: worker is 1 (out of 1 available) 12755 1727204102.14741: exiting _queue_task() for managed-node1/set_fact 12755 1727204102.14755: done queuing things up, now waiting for results queue 
to drain 12755 1727204102.14757: waiting for pending results... 12755 1727204102.15151: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 12755 1727204102.15158: in run() - task 12b410aa-8751-72e9-1a19-000000000633 12755 1727204102.15161: variable 'ansible_search_path' from source: unknown 12755 1727204102.15163: variable 'ansible_search_path' from source: unknown 12755 1727204102.15167: calling self._execute() 12755 1727204102.15458: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.15463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.15466: variable 'omit' from source: magic vars 12755 1727204102.16121: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.16135: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.16301: variable 'profile_stat' from source: set_fact 12755 1727204102.16436: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204102.16440: when evaluation is False, skipping this task 12755 1727204102.16443: _execute() done 12755 1727204102.16447: dumping result to json 12755 1727204102.16455: done dumping result, returning 12755 1727204102.16465: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [12b410aa-8751-72e9-1a19-000000000633] 12755 1727204102.16468: sending task result for task 12b410aa-8751-72e9-1a19-000000000633 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204102.16743: no more pending results, returning what we have 12755 1727204102.16747: results queue empty 12755 1727204102.16750: checking for any_errors_fatal 12755 1727204102.16758: done checking for any_errors_fatal 12755 1727204102.16759: checking for max_fail_percentage 12755 1727204102.16761: done checking for 
max_fail_percentage 12755 1727204102.16762: checking to see if all hosts have failed and the running result is not ok 12755 1727204102.16764: done checking to see if all hosts have failed 12755 1727204102.16765: getting the remaining hosts for this loop 12755 1727204102.16766: done getting the remaining hosts for this loop 12755 1727204102.16772: getting the next task for host managed-node1 12755 1727204102.16780: done getting next task for host managed-node1 12755 1727204102.16783: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12755 1727204102.16792: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204102.16797: getting variables 12755 1727204102.16801: in VariableManager get_vars() 12755 1727204102.16869: Calling all_inventory to load vars for managed-node1 12755 1727204102.16873: Calling groups_inventory to load vars for managed-node1 12755 1727204102.16876: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.17017: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.17023: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.17077: done sending task result for task 12b410aa-8751-72e9-1a19-000000000633 12755 1727204102.17081: WORKER PROCESS EXITING 12755 1727204102.17086: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.19996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.25708: done with get_vars() 12755 1727204102.25759: done getting variables 12755 1727204102.25840: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204102.25983: variable 'profile' from source: include params 12755 1727204102.25988: variable 'item' from source: include params 12755 1727204102.26069: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.121) 0:00:27.496 ***** 12755 1727204102.26113: entering _queue_task() for managed-node1/command 12755 1727204102.26522: worker is 1 (out of 1 available) 12755 1727204102.26537: exiting _queue_task() for managed-node1/command 12755 
1727204102.26665: done queuing things up, now waiting for results queue to drain 12755 1727204102.26667: waiting for pending results... 12755 1727204102.26897: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 12755 1727204102.27063: in run() - task 12b410aa-8751-72e9-1a19-000000000634 12755 1727204102.27258: variable 'ansible_search_path' from source: unknown 12755 1727204102.27263: variable 'ansible_search_path' from source: unknown 12755 1727204102.27267: calling self._execute() 12755 1727204102.27271: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.27274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.27277: variable 'omit' from source: magic vars 12755 1727204102.27808: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.27812: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.28024: variable 'profile_stat' from source: set_fact 12755 1727204102.28041: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204102.28045: when evaluation is False, skipping this task 12755 1727204102.28048: _execute() done 12755 1727204102.28051: dumping result to json 12755 1727204102.28057: done dumping result, returning 12755 1727204102.28066: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [12b410aa-8751-72e9-1a19-000000000634] 12755 1727204102.28072: sending task result for task 12b410aa-8751-72e9-1a19-000000000634 12755 1727204102.28314: done sending task result for task 12b410aa-8751-72e9-1a19-000000000634 12755 1727204102.28322: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204102.28388: no more pending results, returning what we have 12755 1727204102.28406: results queue empty 12755 
1727204102.28408: checking for any_errors_fatal 12755 1727204102.28417: done checking for any_errors_fatal 12755 1727204102.28418: checking for max_fail_percentage 12755 1727204102.28420: done checking for max_fail_percentage 12755 1727204102.28421: checking to see if all hosts have failed and the running result is not ok 12755 1727204102.28423: done checking to see if all hosts have failed 12755 1727204102.28424: getting the remaining hosts for this loop 12755 1727204102.28426: done getting the remaining hosts for this loop 12755 1727204102.28432: getting the next task for host managed-node1 12755 1727204102.28440: done getting next task for host managed-node1 12755 1727204102.28443: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12755 1727204102.28448: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204102.28454: getting variables 12755 1727204102.28456: in VariableManager get_vars() 12755 1727204102.28831: Calling all_inventory to load vars for managed-node1 12755 1727204102.28835: Calling groups_inventory to load vars for managed-node1 12755 1727204102.28839: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.28856: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.28860: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.28864: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.31733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.35067: done with get_vars() 12755 1727204102.35122: done getting variables 12755 1727204102.35197: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204102.35347: variable 'profile' from source: include params 12755 1727204102.35351: variable 'item' from source: include params 12755 1727204102.35429: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.093) 0:00:27.590 ***** 12755 1727204102.35470: entering _queue_task() for managed-node1/set_fact 12755 1727204102.35856: worker is 1 (out of 1 available) 12755 1727204102.35983: exiting _queue_task() for managed-node1/set_fact 12755 1727204102.35999: done queuing things up, now waiting for results queue to drain 12755 1727204102.36001: waiting for pending results... 
12755 1727204102.36220: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 12755 1727204102.36404: in run() - task 12b410aa-8751-72e9-1a19-000000000635 12755 1727204102.36411: variable 'ansible_search_path' from source: unknown 12755 1727204102.36414: variable 'ansible_search_path' from source: unknown 12755 1727204102.36440: calling self._execute() 12755 1727204102.36597: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.36602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.36605: variable 'omit' from source: magic vars 12755 1727204102.37036: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.37047: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.37240: variable 'profile_stat' from source: set_fact 12755 1727204102.37259: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204102.37263: when evaluation is False, skipping this task 12755 1727204102.37265: _execute() done 12755 1727204102.37269: dumping result to json 12755 1727204102.37315: done dumping result, returning 12755 1727204102.37324: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [12b410aa-8751-72e9-1a19-000000000635] 12755 1727204102.37328: sending task result for task 12b410aa-8751-72e9-1a19-000000000635 12755 1727204102.37470: done sending task result for task 12b410aa-8751-72e9-1a19-000000000635 12755 1727204102.37474: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204102.37564: no more pending results, returning what we have 12755 1727204102.37569: results queue empty 12755 1727204102.37571: checking for any_errors_fatal 12755 1727204102.37578: done checking for any_errors_fatal 12755 1727204102.37579: checking 
for max_fail_percentage 12755 1727204102.37581: done checking for max_fail_percentage 12755 1727204102.37582: checking to see if all hosts have failed and the running result is not ok 12755 1727204102.37583: done checking to see if all hosts have failed 12755 1727204102.37584: getting the remaining hosts for this loop 12755 1727204102.37586: done getting the remaining hosts for this loop 12755 1727204102.37593: getting the next task for host managed-node1 12755 1727204102.37605: done getting next task for host managed-node1 12755 1727204102.37609: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12755 1727204102.37613: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204102.37620: getting variables 12755 1727204102.37622: in VariableManager get_vars() 12755 1727204102.38095: Calling all_inventory to load vars for managed-node1 12755 1727204102.38100: Calling groups_inventory to load vars for managed-node1 12755 1727204102.38103: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.38116: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.38120: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.38125: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.42564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.46542: done with get_vars() 12755 1727204102.46799: done getting variables 12755 1727204102.46878: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204102.47261: variable 'profile' from source: include params 12755 1727204102.47266: variable 'item' from source: include params 12755 1727204102.47462: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.120) 0:00:27.711 ***** 12755 1727204102.47553: entering _queue_task() for managed-node1/assert 12755 1727204102.48287: worker is 1 (out of 1 available) 12755 1727204102.48381: exiting _queue_task() for managed-node1/assert 12755 1727204102.48444: done queuing things up, now waiting for results queue to drain 12755 1727204102.48446: waiting for pending results... 
12755 1727204102.48854: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0' 12755 1727204102.48864: in run() - task 12b410aa-8751-72e9-1a19-00000000035d 12755 1727204102.48869: variable 'ansible_search_path' from source: unknown 12755 1727204102.48871: variable 'ansible_search_path' from source: unknown 12755 1727204102.48880: calling self._execute() 12755 1727204102.49011: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.49026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.49041: variable 'omit' from source: magic vars 12755 1727204102.49539: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.49597: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.49610: variable 'omit' from source: magic vars 12755 1727204102.49649: variable 'omit' from source: magic vars 12755 1727204102.49784: variable 'profile' from source: include params 12755 1727204102.49799: variable 'item' from source: include params 12755 1727204102.49885: variable 'item' from source: include params 12755 1727204102.49938: variable 'omit' from source: magic vars 12755 1727204102.49975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204102.50047: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204102.50063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204102.50096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.50155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.50166: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 12755 1727204102.50178: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.50186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.50330: Set connection var ansible_connection to ssh 12755 1727204102.50337: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204102.50340: Set connection var ansible_shell_type to sh 12755 1727204102.50351: Set connection var ansible_timeout to 10 12755 1727204102.50358: Set connection var ansible_shell_executable to /bin/sh 12755 1727204102.50364: Set connection var ansible_pipelining to False 12755 1727204102.50394: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.50397: variable 'ansible_connection' from source: unknown 12755 1727204102.50403: variable 'ansible_module_compression' from source: unknown 12755 1727204102.50406: variable 'ansible_shell_type' from source: unknown 12755 1727204102.50410: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.50414: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.50419: variable 'ansible_pipelining' from source: unknown 12755 1727204102.50425: variable 'ansible_timeout' from source: unknown 12755 1727204102.50430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.50574: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204102.50592: variable 'omit' from source: magic vars 12755 1727204102.50597: starting attempt loop 12755 1727204102.50600: running the handler 12755 1727204102.50699: variable 'lsr_net_profile_exists' from source: set_fact 12755 1727204102.50704: Evaluated conditional 
(lsr_net_profile_exists): True 12755 1727204102.50712: handler run complete 12755 1727204102.50730: attempt loop complete, returning result 12755 1727204102.50733: _execute() done 12755 1727204102.50736: dumping result to json 12755 1727204102.50739: done dumping result, returning 12755 1727204102.50748: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0' [12b410aa-8751-72e9-1a19-00000000035d] 12755 1727204102.50754: sending task result for task 12b410aa-8751-72e9-1a19-00000000035d 12755 1727204102.50852: done sending task result for task 12b410aa-8751-72e9-1a19-00000000035d 12755 1727204102.50855: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204102.50968: no more pending results, returning what we have 12755 1727204102.50976: results queue empty 12755 1727204102.50978: checking for any_errors_fatal 12755 1727204102.50983: done checking for any_errors_fatal 12755 1727204102.50984: checking for max_fail_percentage 12755 1727204102.50986: done checking for max_fail_percentage 12755 1727204102.50987: checking to see if all hosts have failed and the running result is not ok 12755 1727204102.50988: done checking to see if all hosts have failed 12755 1727204102.50991: getting the remaining hosts for this loop 12755 1727204102.50993: done getting the remaining hosts for this loop 12755 1727204102.50996: getting the next task for host managed-node1 12755 1727204102.51050: done getting next task for host managed-node1 12755 1727204102.51054: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 12755 1727204102.51057: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204102.51062: getting variables 12755 1727204102.51063: in VariableManager get_vars() 12755 1727204102.51139: Calling all_inventory to load vars for managed-node1 12755 1727204102.51143: Calling groups_inventory to load vars for managed-node1 12755 1727204102.51147: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.51162: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.51166: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.51171: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.53438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.58462: done with get_vars() 12755 1727204102.58517: done getting variables 12755 1727204102.58599: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204102.58741: variable 'profile' from source: include params 12755 1727204102.58747: variable 'item' from source: include params 12755 1727204102.58821: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:55:02 -0400 
(0:00:00.113) 0:00:27.824 ***** 12755 1727204102.58864: entering _queue_task() for managed-node1/assert 12755 1727204102.59405: worker is 1 (out of 1 available) 12755 1727204102.59421: exiting _queue_task() for managed-node1/assert 12755 1727204102.59436: done queuing things up, now waiting for results queue to drain 12755 1727204102.59437: waiting for pending results... 12755 1727204102.60234: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' 12755 1727204102.60241: in run() - task 12b410aa-8751-72e9-1a19-00000000035e 12755 1727204102.60245: variable 'ansible_search_path' from source: unknown 12755 1727204102.60248: variable 'ansible_search_path' from source: unknown 12755 1727204102.60250: calling self._execute() 12755 1727204102.60433: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.60437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.60442: variable 'omit' from source: magic vars 12755 1727204102.60872: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.60885: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.60894: variable 'omit' from source: magic vars 12755 1727204102.61031: variable 'omit' from source: magic vars 12755 1727204102.61126: variable 'profile' from source: include params 12755 1727204102.61138: variable 'item' from source: include params 12755 1727204102.61208: variable 'item' from source: include params 12755 1727204102.61243: variable 'omit' from source: magic vars 12755 1727204102.61285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204102.61335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204102.61367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 
1727204102.61382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.61401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.61443: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204102.61451: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.61454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.61607: Set connection var ansible_connection to ssh 12755 1727204102.61620: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204102.61628: Set connection var ansible_shell_type to sh 12755 1727204102.61643: Set connection var ansible_timeout to 10 12755 1727204102.61645: Set connection var ansible_shell_executable to /bin/sh 12755 1727204102.61666: Set connection var ansible_pipelining to False 12755 1727204102.61702: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.61714: variable 'ansible_connection' from source: unknown 12755 1727204102.61720: variable 'ansible_module_compression' from source: unknown 12755 1727204102.61723: variable 'ansible_shell_type' from source: unknown 12755 1727204102.61725: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.61732: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.61734: variable 'ansible_pipelining' from source: unknown 12755 1727204102.61738: variable 'ansible_timeout' from source: unknown 12755 1727204102.61743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.61902: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204102.61912: variable 'omit' from source: magic vars 12755 1727204102.61923: starting attempt loop 12755 1727204102.61926: running the handler 12755 1727204102.62027: variable 'lsr_net_profile_ansible_managed' from source: set_fact 12755 1727204102.62031: Evaluated conditional (lsr_net_profile_ansible_managed): True 12755 1727204102.62034: handler run complete 12755 1727204102.62047: attempt loop complete, returning result 12755 1727204102.62050: _execute() done 12755 1727204102.62055: dumping result to json 12755 1727204102.62058: done dumping result, returning 12755 1727204102.62073: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [12b410aa-8751-72e9-1a19-00000000035e] 12755 1727204102.62076: sending task result for task 12b410aa-8751-72e9-1a19-00000000035e 12755 1727204102.62212: done sending task result for task 12b410aa-8751-72e9-1a19-00000000035e 12755 1727204102.62215: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204102.62305: no more pending results, returning what we have 12755 1727204102.62308: results queue empty 12755 1727204102.62310: checking for any_errors_fatal 12755 1727204102.62315: done checking for any_errors_fatal 12755 1727204102.62316: checking for max_fail_percentage 12755 1727204102.62320: done checking for max_fail_percentage 12755 1727204102.62321: checking to see if all hosts have failed and the running result is not ok 12755 1727204102.62322: done checking to see if all hosts have failed 12755 1727204102.62323: getting the remaining hosts for this loop 12755 1727204102.62325: done getting the remaining hosts for this loop 12755 1727204102.62329: getting the next task for host managed-node1 12755 1727204102.62336: done getting 
next task for host managed-node1 12755 1727204102.62339: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12755 1727204102.62342: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204102.62346: getting variables 12755 1727204102.62384: in VariableManager get_vars() 12755 1727204102.62486: Calling all_inventory to load vars for managed-node1 12755 1727204102.62493: Calling groups_inventory to load vars for managed-node1 12755 1727204102.62496: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.62506: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.62508: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.62510: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.64455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.70784: done with get_vars() 12755 1727204102.70814: done getting variables 12755 1727204102.70864: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204102.70964: variable 'profile' from source: include params 12755 1727204102.70968: variable 'item' from 
source: include params 12755 1727204102.71057: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.122) 0:00:27.946 ***** 12755 1727204102.71086: entering _queue_task() for managed-node1/assert 12755 1727204102.71372: worker is 1 (out of 1 available) 12755 1727204102.71388: exiting _queue_task() for managed-node1/assert 12755 1727204102.71403: done queuing things up, now waiting for results queue to drain 12755 1727204102.71405: waiting for pending results... 12755 1727204102.71610: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0 12755 1727204102.71696: in run() - task 12b410aa-8751-72e9-1a19-00000000035f 12755 1727204102.71708: variable 'ansible_search_path' from source: unknown 12755 1727204102.71713: variable 'ansible_search_path' from source: unknown 12755 1727204102.71759: calling self._execute() 12755 1727204102.71837: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.71847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.71860: variable 'omit' from source: magic vars 12755 1727204102.72225: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.72237: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.72244: variable 'omit' from source: magic vars 12755 1727204102.72279: variable 'omit' from source: magic vars 12755 1727204102.72372: variable 'profile' from source: include params 12755 1727204102.72376: variable 'item' from source: include params 12755 1727204102.72437: variable 'item' from source: include params 12755 1727204102.72453: variable 'omit' from source: magic vars 12755 1727204102.72492: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204102.72529: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204102.72548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204102.72564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.72577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.72607: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204102.72611: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.72614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.72704: Set connection var ansible_connection to ssh 12755 1727204102.72711: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204102.72714: Set connection var ansible_shell_type to sh 12755 1727204102.72729: Set connection var ansible_timeout to 10 12755 1727204102.72737: Set connection var ansible_shell_executable to /bin/sh 12755 1727204102.72742: Set connection var ansible_pipelining to False 12755 1727204102.72766: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.72770: variable 'ansible_connection' from source: unknown 12755 1727204102.72772: variable 'ansible_module_compression' from source: unknown 12755 1727204102.72775: variable 'ansible_shell_type' from source: unknown 12755 1727204102.72778: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.72785: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.72788: variable 'ansible_pipelining' from source: unknown 12755 1727204102.72793: variable 'ansible_timeout' 
from source: unknown 12755 1727204102.72799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.72923: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204102.72937: variable 'omit' from source: magic vars 12755 1727204102.72952: starting attempt loop 12755 1727204102.72956: running the handler 12755 1727204102.73062: variable 'lsr_net_profile_fingerprint' from source: set_fact 12755 1727204102.73067: Evaluated conditional (lsr_net_profile_fingerprint): True 12755 1727204102.73087: handler run complete 12755 1727204102.73105: attempt loop complete, returning result 12755 1727204102.73109: _execute() done 12755 1727204102.73112: dumping result to json 12755 1727204102.73115: done dumping result, returning 12755 1727204102.73137: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0 [12b410aa-8751-72e9-1a19-00000000035f] 12755 1727204102.73141: sending task result for task 12b410aa-8751-72e9-1a19-00000000035f 12755 1727204102.73231: done sending task result for task 12b410aa-8751-72e9-1a19-00000000035f 12755 1727204102.73234: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204102.73305: no more pending results, returning what we have 12755 1727204102.73308: results queue empty 12755 1727204102.73310: checking for any_errors_fatal 12755 1727204102.73319: done checking for any_errors_fatal 12755 1727204102.73320: checking for max_fail_percentage 12755 1727204102.73321: done checking for max_fail_percentage 12755 1727204102.73322: checking to see if all hosts have failed and the running result is not ok 12755 1727204102.73324: done checking to see if all 
hosts have failed 12755 1727204102.73324: getting the remaining hosts for this loop 12755 1727204102.73326: done getting the remaining hosts for this loop 12755 1727204102.73331: getting the next task for host managed-node1 12755 1727204102.73341: done getting next task for host managed-node1 12755 1727204102.73344: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12755 1727204102.73347: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204102.73352: getting variables 12755 1727204102.73353: in VariableManager get_vars() 12755 1727204102.73456: Calling all_inventory to load vars for managed-node1 12755 1727204102.73460: Calling groups_inventory to load vars for managed-node1 12755 1727204102.73463: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.73475: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.73478: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.73481: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.74863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.76947: done with get_vars() 12755 1727204102.77005: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.060) 0:00:28.007 ***** 12755 1727204102.77129: entering _queue_task() for managed-node1/include_tasks 12755 1727204102.77476: worker is 1 (out of 1 available) 12755 1727204102.77493: exiting _queue_task() for managed-node1/include_tasks 12755 1727204102.77510: done queuing things up, now waiting for results queue to drain 12755 1727204102.77512: waiting for pending results... 12755 1727204102.77772: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 12755 1727204102.77895: in run() - task 12b410aa-8751-72e9-1a19-000000000363 12755 1727204102.77914: variable 'ansible_search_path' from source: unknown 12755 1727204102.77920: variable 'ansible_search_path' from source: unknown 12755 1727204102.77966: calling self._execute() 12755 1727204102.78075: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.78084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.78097: variable 'omit' from source: magic vars 12755 1727204102.78587: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.78604: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.78614: _execute() done 12755 1727204102.78617: dumping result to json 12755 1727204102.78620: done dumping result, returning 12755 1727204102.78628: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-72e9-1a19-000000000363] 12755 1727204102.78634: sending task result for task 12b410aa-8751-72e9-1a19-000000000363 12755 1727204102.78738: done sending task result for task 12b410aa-8751-72e9-1a19-000000000363 12755 1727204102.78742: WORKER PROCESS EXITING 12755 1727204102.78851: no more pending results, returning what we have 12755 
1727204102.78871: in VariableManager get_vars() 12755 1727204102.78934: Calling all_inventory to load vars for managed-node1 12755 1727204102.78937: Calling groups_inventory to load vars for managed-node1 12755 1727204102.78940: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.78953: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.78956: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.78959: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.80666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.83076: done with get_vars() 12755 1727204102.83107: variable 'ansible_search_path' from source: unknown 12755 1727204102.83109: variable 'ansible_search_path' from source: unknown 12755 1727204102.83145: we have included files to process 12755 1727204102.83146: generating all_blocks data 12755 1727204102.83150: done generating all_blocks data 12755 1727204102.83162: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12755 1727204102.83164: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12755 1727204102.83168: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12755 1727204102.84108: done processing included file 12755 1727204102.84110: iterating over new_blocks loaded from include file 12755 1727204102.84111: in VariableManager get_vars() 12755 1727204102.84139: done with get_vars() 12755 1727204102.84141: filtering new block on tags 12755 1727204102.84162: done filtering new block on tags 12755 1727204102.84164: in VariableManager get_vars() 12755 1727204102.84183: done with get_vars() 12755 1727204102.84186: filtering 
new block on tags 12755 1727204102.84209: done filtering new block on tags 12755 1727204102.84210: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 12755 1727204102.84215: extending task lists for all hosts with included blocks 12755 1727204102.84365: done extending task lists 12755 1727204102.84367: done processing included files 12755 1727204102.84367: results queue empty 12755 1727204102.84368: checking for any_errors_fatal 12755 1727204102.84371: done checking for any_errors_fatal 12755 1727204102.84371: checking for max_fail_percentage 12755 1727204102.84372: done checking for max_fail_percentage 12755 1727204102.84373: checking to see if all hosts have failed and the running result is not ok 12755 1727204102.84374: done checking to see if all hosts have failed 12755 1727204102.84374: getting the remaining hosts for this loop 12755 1727204102.84375: done getting the remaining hosts for this loop 12755 1727204102.84377: getting the next task for host managed-node1 12755 1727204102.84380: done getting next task for host managed-node1 12755 1727204102.84382: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12755 1727204102.84384: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204102.84386: getting variables 12755 1727204102.84387: in VariableManager get_vars() 12755 1727204102.84407: Calling all_inventory to load vars for managed-node1 12755 1727204102.84410: Calling groups_inventory to load vars for managed-node1 12755 1727204102.84412: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.84421: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.84423: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.84426: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.85758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.87805: done with get_vars() 12755 1727204102.87846: done getting variables 12755 1727204102.87891: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.108) 0:00:28.115 ***** 12755 1727204102.87935: entering _queue_task() for managed-node1/set_fact 12755 1727204102.88234: worker is 1 (out of 1 available) 12755 1727204102.88250: exiting _queue_task() for managed-node1/set_fact 12755 1727204102.88267: done queuing things up, now waiting for results queue to drain 12755 1727204102.88268: waiting for pending results... 
12755 1727204102.88765: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 12755 1727204102.88772: in run() - task 12b410aa-8751-72e9-1a19-000000000674 12755 1727204102.88776: variable 'ansible_search_path' from source: unknown 12755 1727204102.88778: variable 'ansible_search_path' from source: unknown 12755 1727204102.88898: calling self._execute() 12755 1727204102.88940: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.88955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.88966: variable 'omit' from source: magic vars 12755 1727204102.89463: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.89484: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.89497: variable 'omit' from source: magic vars 12755 1727204102.89588: variable 'omit' from source: magic vars 12755 1727204102.89605: variable 'omit' from source: magic vars 12755 1727204102.89660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204102.89719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204102.89912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204102.89916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.89919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.89922: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204102.89925: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.89928: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12755 1727204102.89969: Set connection var ansible_connection to ssh 12755 1727204102.89977: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204102.89980: Set connection var ansible_shell_type to sh 12755 1727204102.89997: Set connection var ansible_timeout to 10 12755 1727204102.90009: Set connection var ansible_shell_executable to /bin/sh 12755 1727204102.90020: Set connection var ansible_pipelining to False 12755 1727204102.90048: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.90052: variable 'ansible_connection' from source: unknown 12755 1727204102.90056: variable 'ansible_module_compression' from source: unknown 12755 1727204102.90058: variable 'ansible_shell_type' from source: unknown 12755 1727204102.90061: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.90066: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.90072: variable 'ansible_pipelining' from source: unknown 12755 1727204102.90075: variable 'ansible_timeout' from source: unknown 12755 1727204102.90082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.90271: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204102.90282: variable 'omit' from source: magic vars 12755 1727204102.90287: starting attempt loop 12755 1727204102.90292: running the handler 12755 1727204102.90310: handler run complete 12755 1727204102.90323: attempt loop complete, returning result 12755 1727204102.90326: _execute() done 12755 1727204102.90347: dumping result to json 12755 1727204102.90353: done dumping result, returning 12755 1727204102.90356: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-72e9-1a19-000000000674] 12755 1727204102.90359: sending task result for task 12b410aa-8751-72e9-1a19-000000000674 12755 1727204102.90522: done sending task result for task 12b410aa-8751-72e9-1a19-000000000674 12755 1727204102.90526: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12755 1727204102.90603: no more pending results, returning what we have 12755 1727204102.90608: results queue empty 12755 1727204102.90610: checking for any_errors_fatal 12755 1727204102.90612: done checking for any_errors_fatal 12755 1727204102.90613: checking for max_fail_percentage 12755 1727204102.90615: done checking for max_fail_percentage 12755 1727204102.90616: checking to see if all hosts have failed and the running result is not ok 12755 1727204102.90619: done checking to see if all hosts have failed 12755 1727204102.90620: getting the remaining hosts for this loop 12755 1727204102.90622: done getting the remaining hosts for this loop 12755 1727204102.90628: getting the next task for host managed-node1 12755 1727204102.90638: done getting next task for host managed-node1 12755 1727204102.90641: ^ task is: TASK: Stat profile file 12755 1727204102.90646: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204102.90650: getting variables 12755 1727204102.90653: in VariableManager get_vars() 12755 1727204102.90940: Calling all_inventory to load vars for managed-node1 12755 1727204102.90944: Calling groups_inventory to load vars for managed-node1 12755 1727204102.90947: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204102.90959: Calling all_plugins_play to load vars for managed-node1 12755 1727204102.90963: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204102.90967: Calling groups_plugins_play to load vars for managed-node1 12755 1727204102.92721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204102.94553: done with get_vars() 12755 1727204102.94599: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:55:02 -0400 (0:00:00.067) 0:00:28.182 ***** 12755 1727204102.94724: entering _queue_task() for managed-node1/stat 12755 1727204102.95109: worker is 1 (out of 1 available) 12755 1727204102.95130: exiting _queue_task() for managed-node1/stat 12755 1727204102.95146: done queuing things up, now waiting for results queue to drain 12755 1727204102.95149: waiting for pending results... 
12755 1727204102.95706: running TaskExecutor() for managed-node1/TASK: Stat profile file 12755 1727204102.95853: in run() - task 12b410aa-8751-72e9-1a19-000000000675 12755 1727204102.95857: variable 'ansible_search_path' from source: unknown 12755 1727204102.95861: variable 'ansible_search_path' from source: unknown 12755 1727204102.95885: calling self._execute() 12755 1727204102.96026: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.96084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.96088: variable 'omit' from source: magic vars 12755 1727204102.96917: variable 'ansible_distribution_major_version' from source: facts 12755 1727204102.96936: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204102.96939: variable 'omit' from source: magic vars 12755 1727204102.96942: variable 'omit' from source: magic vars 12755 1727204102.97166: variable 'profile' from source: include params 12755 1727204102.97177: variable 'item' from source: include params 12755 1727204102.97375: variable 'item' from source: include params 12755 1727204102.97407: variable 'omit' from source: magic vars 12755 1727204102.97475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204102.97709: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204102.97743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204102.97768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.97808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204102.97958: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 
1727204102.97961: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.97963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.98098: Set connection var ansible_connection to ssh 12755 1727204102.98102: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204102.98105: Set connection var ansible_shell_type to sh 12755 1727204102.98119: Set connection var ansible_timeout to 10 12755 1727204102.98134: Set connection var ansible_shell_executable to /bin/sh 12755 1727204102.98142: Set connection var ansible_pipelining to False 12755 1727204102.98171: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.98175: variable 'ansible_connection' from source: unknown 12755 1727204102.98178: variable 'ansible_module_compression' from source: unknown 12755 1727204102.98181: variable 'ansible_shell_type' from source: unknown 12755 1727204102.98186: variable 'ansible_shell_executable' from source: unknown 12755 1727204102.98192: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204102.98197: variable 'ansible_pipelining' from source: unknown 12755 1727204102.98206: variable 'ansible_timeout' from source: unknown 12755 1727204102.98209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204102.98488: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204102.98522: variable 'omit' from source: magic vars 12755 1727204102.98526: starting attempt loop 12755 1727204102.98529: running the handler 12755 1727204102.98536: _low_level_execute_command(): starting 12755 1727204102.98545: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204102.99421: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204102.99453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204102.99538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204102.99542: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204102.99577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204102.99656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.01506: stdout chunk (state=3): >>>/root <<< 12755 1727204103.01813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204103.01819: stdout chunk (state=3): >>><<< 12755 1727204103.01825: stderr chunk (state=3): >>><<< 12755 1727204103.01993: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204103.01998: _low_level_execute_command(): starting 12755 1727204103.02001: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392 `" && echo ansible-tmp-1727204103.018698-14555-30457017723392="` echo /root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392 `" ) && sleep 0' 12755 1727204103.02634: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204103.02650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204103.02784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204103.02824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.02914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.05136: stdout chunk (state=3): >>>ansible-tmp-1727204103.018698-14555-30457017723392=/root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392 <<< 12755 1727204103.05301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204103.05329: stdout chunk (state=3): >>><<< 12755 1727204103.05332: stderr chunk (state=3): >>><<< 12755 1727204103.05354: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204103.018698-14555-30457017723392=/root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204103.05495: variable 'ansible_module_compression' from source: unknown 12755 1727204103.05498: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12755 1727204103.05540: variable 'ansible_facts' from source: unknown 12755 1727204103.05644: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/AnsiballZ_stat.py 12755 1727204103.05815: Sending initial data 12755 1727204103.05951: Sent initial data (151 bytes) 12755 1727204103.06543: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204103.06562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204103.06581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204103.06701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204103.06720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204103.06744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.06861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.08639: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204103.08726: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/AnsiballZ_stat.py" <<< 12755 1727204103.08750: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmp6x4ilhqh /root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/AnsiballZ_stat.py <<< 12755 1727204103.08809: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmp6x4ilhqh" to remote "/root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/AnsiballZ_stat.py" <<< 12755 1727204103.10525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204103.10607: stderr chunk (state=3): >>><<< 12755 1727204103.10628: stdout chunk (state=3): >>><<< 12755 1727204103.10668: done transferring module to remote 12755 1727204103.10687: _low_level_execute_command(): starting 12755 1727204103.10744: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/ /root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/AnsiballZ_stat.py && sleep 0' 12755 1727204103.11310: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204103.11340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204103.11344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.11397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204103.11403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204103.11422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.11463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.13697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204103.13702: stdout chunk (state=3): >>><<< 12755 1727204103.13705: stderr chunk (state=3): >>><<< 12755 1727204103.13732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204103.13766: _low_level_execute_command(): starting 12755 1727204103.13769: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/AnsiballZ_stat.py && sleep 0' 12755 1727204103.14911: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204103.15265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204103.15272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204103.15275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.15500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.33188: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, 
"invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12755 1727204103.34898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204103.34905: stdout chunk (state=3): >>><<< 12755 1727204103.34907: stderr chunk (state=3): >>><<< 12755 1727204103.34910: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204103.34913: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204103.34926: _low_level_execute_command(): starting 12755 1727204103.34932: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204103.018698-14555-30457017723392/ > /dev/null 2>&1 && sleep 0' 12755 1727204103.35606: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204103.35697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204103.35701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204103.35704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204103.35707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204103.35709: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204103.35712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.35714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 
1727204103.35716: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204103.35719: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12755 1727204103.35721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204103.35723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204103.35736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204103.35744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204103.35752: stderr chunk (state=3): >>>debug2: match found <<< 12755 1727204103.35762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.35847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204103.35856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204103.35898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.35947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.38120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204103.38138: stderr chunk (state=3): >>><<< 12755 1727204103.38148: stdout chunk (state=3): >>><<< 12755 1727204103.38173: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204103.38192: handler run complete 12755 1727204103.38229: attempt loop complete, returning result 12755 1727204103.38241: _execute() done 12755 1727204103.38249: dumping result to json 12755 1727204103.38258: done dumping result, returning 12755 1727204103.38296: done running TaskExecutor() for managed-node1/TASK: Stat profile file [12b410aa-8751-72e9-1a19-000000000675] 12755 1727204103.38299: sending task result for task 12b410aa-8751-72e9-1a19-000000000675 ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 12755 1727204103.38499: no more pending results, returning what we have 12755 1727204103.38504: results queue empty 12755 1727204103.38505: checking for any_errors_fatal 12755 1727204103.38514: done checking for any_errors_fatal 12755 1727204103.38515: checking for max_fail_percentage 12755 1727204103.38516: done checking for max_fail_percentage 12755 1727204103.38519: checking to see if all hosts have failed and the running result is not ok 12755 1727204103.38520: done checking to see if all hosts have failed 12755 1727204103.38521: getting the remaining hosts for this loop 12755 1727204103.38523: done getting the remaining hosts for this loop 12755 1727204103.38529: getting the next task for host managed-node1 12755 
1727204103.38537: done getting next task for host managed-node1 12755 1727204103.38542: ^ task is: TASK: Set NM profile exist flag based on the profile files 12755 1727204103.38545: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204103.38550: getting variables 12755 1727204103.38552: in VariableManager get_vars() 12755 1727204103.38792: Calling all_inventory to load vars for managed-node1 12755 1727204103.38796: Calling groups_inventory to load vars for managed-node1 12755 1727204103.38799: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204103.38807: done sending task result for task 12b410aa-8751-72e9-1a19-000000000675 12755 1727204103.38810: WORKER PROCESS EXITING 12755 1727204103.38821: Calling all_plugins_play to load vars for managed-node1 12755 1727204103.38825: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204103.38830: Calling groups_plugins_play to load vars for managed-node1 12755 1727204103.40253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204103.42260: done with get_vars() 12755 1727204103.42305: done getting variables 12755 1727204103.42388: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.477) 0:00:28.660 ***** 12755 1727204103.42429: entering _queue_task() for managed-node1/set_fact 12755 1727204103.42775: worker is 1 (out of 1 available) 12755 1727204103.42984: exiting _queue_task() for managed-node1/set_fact 12755 1727204103.43000: done queuing things up, now waiting for results queue to drain 12755 1727204103.43002: waiting for pending results... 
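The stat task above finished with `ok: [managed-node1] => {"changed": false, "stat": {"exists": false}}`: the profile file is absent on the managed node, so the module reports only the existence flag and changes nothing. A minimal sketch of that result shape (an illustration, not Ansible's actual `stat` module source):

```python
import os

def stat_profile(path, follow=False):
    """Hedged sketch of the stat-style result dict seen in the log above.

    For a missing path the result carries only {"exists": False}; extra
    fields appear only when the file is present. Not Ansible's real code.
    """
    exists = os.path.exists(path) if follow else os.path.lexists(path)
    result = {"changed": False, "stat": {"exists": exists}}
    if exists:
        st = os.stat(path) if follow else os.lstat(path)
        # A real stat module returns many more fields; two suffice here.
        result["stat"].update({"size": st.st_size, "mode": oct(st.st_mode)})
    return result

# The ifcfg file checked in the log was absent, so the task came back
# ok (rc=0) with stat.exists == False and changed == False.
result = stat_profile("/etc/sysconfig/network-scripts/ifcfg-bond0.1")
```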
12755 1727204103.43133: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 12755 1727204103.43343: in run() - task 12b410aa-8751-72e9-1a19-000000000676 12755 1727204103.43347: variable 'ansible_search_path' from source: unknown 12755 1727204103.43351: variable 'ansible_search_path' from source: unknown 12755 1727204103.43355: calling self._execute() 12755 1727204103.43482: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204103.43492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204103.43505: variable 'omit' from source: magic vars 12755 1727204103.44010: variable 'ansible_distribution_major_version' from source: facts 12755 1727204103.44031: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204103.44140: variable 'profile_stat' from source: set_fact 12755 1727204103.44151: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204103.44155: when evaluation is False, skipping this task 12755 1727204103.44158: _execute() done 12755 1727204103.44163: dumping result to json 12755 1727204103.44166: done dumping result, returning 12755 1727204103.44174: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-72e9-1a19-000000000676] 12755 1727204103.44180: sending task result for task 12b410aa-8751-72e9-1a19-000000000676 12755 1727204103.44283: done sending task result for task 12b410aa-8751-72e9-1a19-000000000676 12755 1727204103.44287: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204103.44342: no more pending results, returning what we have 12755 1727204103.44346: results queue empty 12755 1727204103.44347: checking for any_errors_fatal 12755 1727204103.44358: done checking for any_errors_fatal 12755 1727204103.44358: 
checking for max_fail_percentage 12755 1727204103.44360: done checking for max_fail_percentage 12755 1727204103.44361: checking to see if all hosts have failed and the running result is not ok 12755 1727204103.44362: done checking to see if all hosts have failed 12755 1727204103.44363: getting the remaining hosts for this loop 12755 1727204103.44365: done getting the remaining hosts for this loop 12755 1727204103.44370: getting the next task for host managed-node1 12755 1727204103.44377: done getting next task for host managed-node1 12755 1727204103.44379: ^ task is: TASK: Get NM profile info 12755 1727204103.44383: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204103.44388: getting variables 12755 1727204103.44391: in VariableManager get_vars() 12755 1727204103.44451: Calling all_inventory to load vars for managed-node1 12755 1727204103.44455: Calling groups_inventory to load vars for managed-node1 12755 1727204103.44457: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204103.44471: Calling all_plugins_play to load vars for managed-node1 12755 1727204103.44474: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204103.44478: Calling groups_plugins_play to load vars for managed-node1 12755 1727204103.45857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204103.48055: done with get_vars() 12755 1727204103.48086: done getting variables 12755 1727204103.48149: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.057) 0:00:28.717 ***** 12755 1727204103.48177: entering _queue_task() for managed-node1/shell 12755 1727204103.48458: worker is 1 (out of 1 available) 12755 1727204103.48475: exiting _queue_task() for managed-node1/shell 12755 1727204103.48491: done queuing things up, now waiting for results queue to drain 12755 1727204103.48493: waiting for pending results... 
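The skip recorded above ("Evaluated conditional (profile_stat.stat.exists): False ... when evaluation is False, skipping this task") shows the `when` handling: the conditional is evaluated against the registered fact, and a False result short-circuits execution into a skipped result carrying `false_condition` and `skip_reason`. A minimal sketch of that control flow (hypothetical helper, not Ansible internals):

```python
# Hedged sketch (hypothetical, not Ansible internals) of how a `when`
# conditional such as `profile_stat.stat.exists` becomes a skipped result.
profile_stat = {"stat": {"exists": False}}  # fact registered by the stat task

def evaluate_when(conditional, false_condition):
    """Mimic the skip payload the log prints when a conditional is False."""
    if not conditional:
        return {
            "changed": False,
            "false_condition": false_condition,
            "skip_reason": "Conditional result was False",
            "skipped": True,
        }
    # A True conditional lets the task run normally.
    return {"skipped": False}

outcome = evaluate_when(profile_stat["stat"]["exists"],
                        "profile_stat.stat.exists")
```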
12755 1727204103.48703: running TaskExecutor() for managed-node1/TASK: Get NM profile info 12755 1727204103.48798: in run() - task 12b410aa-8751-72e9-1a19-000000000677 12755 1727204103.48811: variable 'ansible_search_path' from source: unknown 12755 1727204103.48815: variable 'ansible_search_path' from source: unknown 12755 1727204103.48854: calling self._execute() 12755 1727204103.48942: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204103.48949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204103.48960: variable 'omit' from source: magic vars 12755 1727204103.49312: variable 'ansible_distribution_major_version' from source: facts 12755 1727204103.49318: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204103.49330: variable 'omit' from source: magic vars 12755 1727204103.49371: variable 'omit' from source: magic vars 12755 1727204103.49462: variable 'profile' from source: include params 12755 1727204103.49466: variable 'item' from source: include params 12755 1727204103.49529: variable 'item' from source: include params 12755 1727204103.49546: variable 'omit' from source: magic vars 12755 1727204103.49585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204103.49621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204103.49645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204103.49662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204103.49674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204103.49709: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 
1727204103.49712: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204103.49715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204103.49802: Set connection var ansible_connection to ssh 12755 1727204103.49810: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204103.49813: Set connection var ansible_shell_type to sh 12755 1727204103.49828: Set connection var ansible_timeout to 10 12755 1727204103.49834: Set connection var ansible_shell_executable to /bin/sh 12755 1727204103.49841: Set connection var ansible_pipelining to False 12755 1727204103.49865: variable 'ansible_shell_executable' from source: unknown 12755 1727204103.49868: variable 'ansible_connection' from source: unknown 12755 1727204103.49871: variable 'ansible_module_compression' from source: unknown 12755 1727204103.49873: variable 'ansible_shell_type' from source: unknown 12755 1727204103.49878: variable 'ansible_shell_executable' from source: unknown 12755 1727204103.49884: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204103.49890: variable 'ansible_pipelining' from source: unknown 12755 1727204103.49893: variable 'ansible_timeout' from source: unknown 12755 1727204103.49899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204103.50027: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204103.50039: variable 'omit' from source: magic vars 12755 1727204103.50043: starting attempt loop 12755 1727204103.50046: running the handler 12755 1727204103.50057: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204103.50078: _low_level_execute_command(): starting 12755 1727204103.50085: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204103.50665: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204103.50672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204103.50676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.50733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204103.50736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204103.50738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.50785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.52686: stdout chunk (state=3): >>>/root <<< 12755 1727204103.52783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 
1727204103.52861: stderr chunk (state=3): >>><<< 12755 1727204103.52864: stdout chunk (state=3): >>><<< 12755 1727204103.52873: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204103.52895: _low_level_execute_command(): starting 12755 1727204103.52907: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215 `" && echo ansible-tmp-1727204103.5287538-14579-193427676313215="` echo /root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215 `" ) && sleep 0' 12755 1727204103.53597: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.53616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204103.53636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.53811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.55903: stdout chunk (state=3): >>>ansible-tmp-1727204103.5287538-14579-193427676313215=/root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215 <<< 12755 1727204103.56029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204103.56094: stderr chunk (state=3): >>><<< 12755 1727204103.56098: stdout chunk (state=3): >>><<< 12755 1727204103.56119: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204103.5287538-14579-193427676313215=/root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204103.56209: variable 'ansible_module_compression' from source: unknown 12755 1727204103.56273: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204103.56295: variable 'ansible_facts' from source: unknown 12755 1727204103.56497: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/AnsiballZ_command.py 12755 1727204103.56569: Sending initial data 12755 1727204103.56580: Sent initial data (156 bytes) 12755 1727204103.57199: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204103.57225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.57315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.57364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.59131: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204103.59173: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204103.59213: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmp_3xfv7wn /root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/AnsiballZ_command.py <<< 12755 1727204103.59221: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/AnsiballZ_command.py" <<< 12755 1727204103.59301: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmp_3xfv7wn" to remote "/root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/AnsiballZ_command.py" <<< 12755 1727204103.59306: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/AnsiballZ_command.py" <<< 12755 1727204103.60216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204103.60293: stderr chunk (state=3): >>><<< 12755 1727204103.60297: stdout chunk (state=3): >>><<< 12755 1727204103.60321: done transferring module to remote 12755 1727204103.60335: _low_level_execute_command(): starting 12755 1727204103.60341: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/ /root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/AnsiballZ_command.py && sleep 0' 12755 1727204103.60851: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204103.60855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 
1727204103.60858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204103.60860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204103.60862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.60917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204103.60924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204103.60926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.60969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.63101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204103.63153: stderr chunk (state=3): >>><<< 12755 1727204103.63171: stdout chunk (state=3): >>><<< 12755 1727204103.63204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204103.63248: _low_level_execute_command(): starting 12755 1727204103.63252: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/AnsiballZ_command.py && sleep 0' 12755 1727204103.63988: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204103.64013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204103.64110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.64161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204103.64180: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK <<< 12755 1727204103.64213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.64368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.84398: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:55:03.819734", "end": "2024-09-24 14:55:03.842702", "delta": "0:00:00.022968", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204103.86596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204103.86601: stdout chunk (state=3): >>><<< 12755 1727204103.86604: stderr chunk (state=3): >>><<< 12755 1727204103.86607: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:55:03.819734", "end": "2024-09-24 14:55:03.842702", "delta": "0:00:00.022968", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 
closed. 12755 1727204103.86609: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204103.86612: _low_level_execute_command(): starting 12755 1727204103.86614: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204103.5287538-14579-193427676313215/ > /dev/null 2>&1 && sleep 0' 12755 1727204103.87358: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204103.87362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.87392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204103.87396: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204103.87457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204103.87466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204103.87632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204103.89588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204103.89655: stderr chunk (state=3): >>><<< 12755 1727204103.89659: stdout chunk (state=3): >>><<< 12755 1727204103.89676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 12755 1727204103.89683: handler run complete 12755 1727204103.89709: Evaluated conditional (False): False 12755 1727204103.89726: attempt loop complete, returning result 12755 1727204103.89729: _execute() done 12755 1727204103.89732: dumping result to json 12755 1727204103.89738: done dumping result, returning 12755 1727204103.89747: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [12b410aa-8751-72e9-1a19-000000000677] 12755 1727204103.89752: sending task result for task 12b410aa-8751-72e9-1a19-000000000677 12755 1727204103.89860: done sending task result for task 12b410aa-8751-72e9-1a19-000000000677 12755 1727204103.89863: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.022968", "end": "2024-09-24 14:55:03.842702", "rc": 0, "start": "2024-09-24 14:55:03.819734" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 12755 1727204103.89951: no more pending results, returning what we have 12755 1727204103.89955: results queue empty 12755 1727204103.89962: checking for any_errors_fatal 12755 1727204103.89969: done checking for any_errors_fatal 12755 1727204103.89970: checking for max_fail_percentage 12755 1727204103.89974: done checking for max_fail_percentage 12755 1727204103.89974: checking to see if all hosts have failed and the running result is not ok 12755 1727204103.89976: done checking to see if all hosts have failed 12755 1727204103.89977: getting the remaining hosts for this loop 12755 1727204103.89978: done getting the remaining hosts for this loop 12755 1727204103.89983: getting the next task for host managed-node1 12755 1727204103.89992: done getting next task for host managed-node1 12755 1727204103.89995: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12755 1727204103.89999: ^ state is: HOST 
STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204103.90003: getting variables 12755 1727204103.90005: in VariableManager get_vars() 12755 1727204103.90058: Calling all_inventory to load vars for managed-node1 12755 1727204103.90061: Calling groups_inventory to load vars for managed-node1 12755 1727204103.90064: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204103.90076: Calling all_plugins_play to load vars for managed-node1 12755 1727204103.90079: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204103.90083: Calling groups_plugins_play to load vars for managed-node1 12755 1727204103.92745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204103.98320: done with get_vars() 12755 1727204103.98372: done getting variables 12755 1727204103.98461: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM 
profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:55:03 -0400 (0:00:00.503) 0:00:29.220 ***** 12755 1727204103.98504: entering _queue_task() for managed-node1/set_fact 12755 1727204103.98927: worker is 1 (out of 1 available) 12755 1727204103.98941: exiting _queue_task() for managed-node1/set_fact 12755 1727204103.99199: done queuing things up, now waiting for results queue to drain 12755 1727204103.99201: waiting for pending results... 12755 1727204103.99410: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12755 1727204103.99446: in run() - task 12b410aa-8751-72e9-1a19-000000000678 12755 1727204103.99469: variable 'ansible_search_path' from source: unknown 12755 1727204103.99478: variable 'ansible_search_path' from source: unknown 12755 1727204103.99613: calling self._execute() 12755 1727204103.99666: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204103.99681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204103.99701: variable 'omit' from source: magic vars 12755 1727204104.00630: variable 'ansible_distribution_major_version' from source: facts 12755 1727204104.00644: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204104.01097: variable 'nm_profile_exists' from source: set_fact 12755 1727204104.01102: Evaluated conditional (nm_profile_exists.rc == 0): True 12755 1727204104.01105: variable 'omit' from source: magic vars 12755 1727204104.01255: variable 'omit' from source: magic vars 12755 1727204104.01363: variable 'omit' from source: magic vars 12755 1727204104.01535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204104.01647: Loading Connection 'ssh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204104.01680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204104.01747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204104.01973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204104.01976: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204104.01979: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.01980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.02144: Set connection var ansible_connection to ssh 12755 1727204104.02493: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204104.02498: Set connection var ansible_shell_type to sh 12755 1727204104.02501: Set connection var ansible_timeout to 10 12755 1727204104.02504: Set connection var ansible_shell_executable to /bin/sh 12755 1727204104.02506: Set connection var ansible_pipelining to False 12755 1727204104.02508: variable 'ansible_shell_executable' from source: unknown 12755 1727204104.02511: variable 'ansible_connection' from source: unknown 12755 1727204104.02513: variable 'ansible_module_compression' from source: unknown 12755 1727204104.02515: variable 'ansible_shell_type' from source: unknown 12755 1727204104.02519: variable 'ansible_shell_executable' from source: unknown 12755 1727204104.02521: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.02524: variable 'ansible_pipelining' from source: unknown 12755 1727204104.02598: variable 'ansible_timeout' from source: unknown 12755 1727204104.02603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 
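The "Set connection var" records above show the effective connection settings for this task: ssh connection, ZIP_DEFLATED module compression, sh shell, a 10-second timeout, /bin/sh as the shell executable, and pipelining off. For reference, roughly the same values could be pinned in an ansible.cfg; this is an equivalent sketch, not taken from the run itself (the header notes "No config file found; using defaults"):

```ini
; Sketch of ansible.cfg settings matching the connection vars traced above.
; The run used built-in defaults; this file is illustrative only.
[defaults]
timeout = 10
executable = /bin/sh
module_compression = ZIP_DEFLATED

[connection]
pipelining = False
```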
12755 1727204104.02965: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204104.02969: variable 'omit' from source: magic vars 12755 1727204104.02971: starting attempt loop 12755 1727204104.02973: running the handler 12755 1727204104.03074: handler run complete 12755 1727204104.03077: attempt loop complete, returning result 12755 1727204104.03080: _execute() done 12755 1727204104.03082: dumping result to json 12755 1727204104.03087: done dumping result, returning 12755 1727204104.03105: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-72e9-1a19-000000000678] 12755 1727204104.03183: sending task result for task 12b410aa-8751-72e9-1a19-000000000678 ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12755 1727204104.03440: no more pending results, returning what we have 12755 1727204104.03444: results queue empty 12755 1727204104.03445: checking for any_errors_fatal 12755 1727204104.03456: done checking for any_errors_fatal 12755 1727204104.03457: checking for max_fail_percentage 12755 1727204104.03459: done checking for max_fail_percentage 12755 1727204104.03460: checking to see if all hosts have failed and the running result is not ok 12755 1727204104.03461: done checking to see if all hosts have failed 12755 1727204104.03462: getting the remaining hosts for this loop 12755 1727204104.03464: done getting the remaining hosts for this loop 12755 1727204104.03586: getting the next task for host managed-node1 12755 1727204104.03602: done getting next task for host managed-node1 12755 
1727204104.03606: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12755 1727204104.03611: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204104.03619: getting variables 12755 1727204104.03622: in VariableManager get_vars() 12755 1727204104.04007: Calling all_inventory to load vars for managed-node1 12755 1727204104.04012: Calling groups_inventory to load vars for managed-node1 12755 1727204104.04016: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204104.04048: Calling all_plugins_play to load vars for managed-node1 12755 1727204104.04053: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204104.04058: Calling groups_plugins_play to load vars for managed-node1 12755 1727204104.04739: done sending task result for task 12b410aa-8751-72e9-1a19-000000000678 12755 1727204104.04744: WORKER PROCESS EXITING 12755 1727204104.09626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204104.16411: done with get_vars() 12755 1727204104.16578: done getting variables 12755 1727204104.16837: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204104.17155: variable 'profile' from source: include params 12755 1727204104.17160: variable 'item' from source: include params 12755 1727204104.17358: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.189) 0:00:29.409 ***** 12755 1727204104.17409: entering _queue_task() for managed-node1/command 12755 1727204104.18361: worker is 1 (out of 1 available) 12755 1727204104.18375: exiting _queue_task() for managed-node1/command 12755 1727204104.18388: done queuing things up, now waiting for results queue to drain 12755 1727204104.18391: waiting for pending results... 
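The ok result a few records back, which set lsr_net_profile_exists, lsr_net_profile_ansible_managed, and lsr_net_profile_fingerprint, suggests the set_fact task at get_profile_stat.yml:35 looks roughly like the sketch below. This is a hedged reconstruction from the trace (the "Evaluated conditional" lines and the returned ansible_facts), not the collection's actual source:

```yaml
# Hypothetical reconstruction of get_profile_stat.yml:35, inferred from the
# trace: nm_profile_exists.rc == 0 evaluated True, and the handler returned
# the three lsr_net_profile_* facts seen in the ok result.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0
```

The `ansible_distribution_major_version != '6'` conditional that the trace evaluates first is most likely inherited from the enclosing play or include, which is why it appears on every task in this run rather than in the task's own `when`.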
12755 1727204104.18984: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 12755 1727204104.19315: in run() - task 12b410aa-8751-72e9-1a19-00000000067a 12755 1727204104.19332: variable 'ansible_search_path' from source: unknown 12755 1727204104.19336: variable 'ansible_search_path' from source: unknown 12755 1727204104.19374: calling self._execute() 12755 1727204104.19482: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.19490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.19667: variable 'omit' from source: magic vars 12755 1727204104.20798: variable 'ansible_distribution_major_version' from source: facts 12755 1727204104.20804: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204104.20808: variable 'profile_stat' from source: set_fact 12755 1727204104.20811: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204104.20814: when evaluation is False, skipping this task 12755 1727204104.20816: _execute() done 12755 1727204104.20822: dumping result to json 12755 1727204104.20824: done dumping result, returning 12755 1727204104.20827: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [12b410aa-8751-72e9-1a19-00000000067a] 12755 1727204104.20830: sending task result for task 12b410aa-8751-72e9-1a19-00000000067a 12755 1727204104.20907: done sending task result for task 12b410aa-8751-72e9-1a19-00000000067a skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204104.20970: no more pending results, returning what we have 12755 1727204104.20975: results queue empty 12755 1727204104.20976: checking for any_errors_fatal 12755 1727204104.20984: done checking for any_errors_fatal 12755 1727204104.20985: checking for max_fail_percentage 12755 
1727204104.20987: done checking for max_fail_percentage 12755 1727204104.20988: checking to see if all hosts have failed and the running result is not ok 12755 1727204104.20991: done checking to see if all hosts have failed 12755 1727204104.20992: getting the remaining hosts for this loop 12755 1727204104.20994: done getting the remaining hosts for this loop 12755 1727204104.20999: getting the next task for host managed-node1 12755 1727204104.21006: done getting next task for host managed-node1 12755 1727204104.21010: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12755 1727204104.21014: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204104.21021: getting variables 12755 1727204104.21023: in VariableManager get_vars() 12755 1727204104.21078: Calling all_inventory to load vars for managed-node1 12755 1727204104.21082: Calling groups_inventory to load vars for managed-node1 12755 1727204104.21084: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204104.21229: Calling all_plugins_play to load vars for managed-node1 12755 1727204104.21234: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204104.21239: Calling groups_plugins_play to load vars for managed-node1 12755 1727204104.21984: WORKER PROCESS EXITING 12755 1727204104.26786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204104.33251: done with get_vars() 12755 1727204104.33307: done getting variables 12755 1727204104.33501: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204104.33755: variable 'profile' from source: include params 12755 1727204104.33760: variable 'item' from source: include params 12755 1727204104.33961: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.165) 0:00:29.575 ***** 12755 1727204104.34004: entering _queue_task() for managed-node1/set_fact 12755 1727204104.35009: worker is 1 (out of 1 available) 12755 1727204104.35024: exiting _queue_task() for managed-node1/set_fact 12755 1727204104.35038: done queuing things up, now waiting for results queue to drain 12755 
1727204104.35040: waiting for pending results... 12755 1727204104.35529: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 12755 1727204104.35896: in run() - task 12b410aa-8751-72e9-1a19-00000000067b 12755 1727204104.35914: variable 'ansible_search_path' from source: unknown 12755 1727204104.35920: variable 'ansible_search_path' from source: unknown 12755 1727204104.36159: calling self._execute() 12755 1727204104.36279: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.36287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.36301: variable 'omit' from source: magic vars 12755 1727204104.37287: variable 'ansible_distribution_major_version' from source: facts 12755 1727204104.37423: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204104.37680: variable 'profile_stat' from source: set_fact 12755 1727204104.37849: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204104.37874: when evaluation is False, skipping this task 12755 1727204104.37878: _execute() done 12755 1727204104.37881: dumping result to json 12755 1727204104.37884: done dumping result, returning 12755 1727204104.37886: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [12b410aa-8751-72e9-1a19-00000000067b] 12755 1727204104.37958: sending task result for task 12b410aa-8751-72e9-1a19-00000000067b skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204104.38155: no more pending results, returning what we have 12755 1727204104.38159: results queue empty 12755 1727204104.38161: checking for any_errors_fatal 12755 1727204104.38171: done checking for any_errors_fatal 12755 1727204104.38172: checking for max_fail_percentage 12755 1727204104.38174: done checking for max_fail_percentage 
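Both the "Get" and "Verify" tasks for the ansible_managed comment were skipped on the same false_condition, `profile_stat.stat.exists`. The guard pattern is roughly the following sketch; only the task names and the stat variable come from the trace, while the file path, grep command, and register name are assumptions:

```yaml
# Hedged sketch of the skip pattern seen above: a stat result gathered
# earlier gates both the lookup command and the set_fact that verifies it.
# The grep command and register name below are assumptions, not trace output.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep '^# Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_comment  # hypothetical register name
  when: profile_stat.stat.exists

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: true  # mirrors the fact set in the nmcli branch
  when: profile_stat.stat.exists
```

In this run the skip is expected: the profile is an NM keyfile connection, so no ifcfg-bond0.1 file exists and the facts were already set from the nmcli output instead.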
12755 1727204104.38175: checking to see if all hosts have failed and the running result is not ok 12755 1727204104.38183: done checking to see if all hosts have failed 12755 1727204104.38184: getting the remaining hosts for this loop 12755 1727204104.38186: done getting the remaining hosts for this loop 12755 1727204104.38194: getting the next task for host managed-node1 12755 1727204104.38204: done getting next task for host managed-node1 12755 1727204104.38207: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12755 1727204104.38211: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204104.38216: getting variables 12755 1727204104.38221: in VariableManager get_vars() 12755 1727204104.38399: Calling all_inventory to load vars for managed-node1 12755 1727204104.38457: Calling groups_inventory to load vars for managed-node1 12755 1727204104.38461: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204104.38477: Calling all_plugins_play to load vars for managed-node1 12755 1727204104.38480: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204104.38484: Calling groups_plugins_play to load vars for managed-node1 12755 1727204104.39068: done sending task result for task 12b410aa-8751-72e9-1a19-00000000067b 12755 1727204104.39072: WORKER PROCESS EXITING 12755 1727204104.41202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204104.44938: done with get_vars() 12755 1727204104.44986: done getting variables 12755 1727204104.45067: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204104.45219: variable 'profile' from source: include params 12755 1727204104.45225: variable 'item' from source: include params 12755 1727204104.45305: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.113) 0:00:29.689 ***** 12755 1727204104.45345: entering _queue_task() for managed-node1/command 12755 1727204104.45731: worker is 1 (out of 1 available) 12755 1727204104.45745: exiting _queue_task() for managed-node1/command 12755 
1727204104.45760: done queuing things up, now waiting for results queue to drain 12755 1727204104.45761: waiting for pending results... 12755 1727204104.46083: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 12755 1727204104.46419: in run() - task 12b410aa-8751-72e9-1a19-00000000067c 12755 1727204104.46425: variable 'ansible_search_path' from source: unknown 12755 1727204104.46429: variable 'ansible_search_path' from source: unknown 12755 1727204104.46433: calling self._execute() 12755 1727204104.46803: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.46807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.46811: variable 'omit' from source: magic vars 12755 1727204104.46991: variable 'ansible_distribution_major_version' from source: facts 12755 1727204104.47005: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204104.47169: variable 'profile_stat' from source: set_fact 12755 1727204104.47187: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204104.47193: when evaluation is False, skipping this task 12755 1727204104.47196: _execute() done 12755 1727204104.47199: dumping result to json 12755 1727204104.47204: done dumping result, returning 12755 1727204104.47212: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [12b410aa-8751-72e9-1a19-00000000067c] 12755 1727204104.47219: sending task result for task 12b410aa-8751-72e9-1a19-00000000067c skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204104.47381: no more pending results, returning what we have 12755 1727204104.47386: results queue empty 12755 1727204104.47387: checking for any_errors_fatal 12755 1727204104.47596: done checking for any_errors_fatal 12755 1727204104.47598: checking for 
max_fail_percentage 12755 1727204104.47601: done checking for max_fail_percentage 12755 1727204104.47601: checking to see if all hosts have failed and the running result is not ok 12755 1727204104.47603: done checking to see if all hosts have failed 12755 1727204104.47604: getting the remaining hosts for this loop 12755 1727204104.47605: done getting the remaining hosts for this loop 12755 1727204104.47610: getting the next task for host managed-node1 12755 1727204104.47620: done getting next task for host managed-node1 12755 1727204104.47624: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12755 1727204104.47629: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204104.47634: getting variables 12755 1727204104.47636: in VariableManager get_vars() 12755 1727204104.47898: Calling all_inventory to load vars for managed-node1 12755 1727204104.47902: Calling groups_inventory to load vars for managed-node1 12755 1727204104.47906: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204104.47924: Calling all_plugins_play to load vars for managed-node1 12755 1727204104.47928: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204104.47934: Calling groups_plugins_play to load vars for managed-node1 12755 1727204104.48454: done sending task result for task 12b410aa-8751-72e9-1a19-00000000067c 12755 1727204104.48990: WORKER PROCESS EXITING 12755 1727204104.51504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204104.58173: done with get_vars() 12755 1727204104.58269: done getting variables 12755 1727204104.58358: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204104.58496: variable 'profile' from source: include params 12755 1727204104.58500: variable 'item' from source: include params 12755 1727204104.58597: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.132) 0:00:29.822 ***** 12755 1727204104.58635: entering _queue_task() for managed-node1/set_fact 12755 1727204104.59031: worker is 1 (out of 1 available) 12755 1727204104.59046: exiting _queue_task() for managed-node1/set_fact 12755 
1727204104.59060: done queuing things up, now waiting for results queue to drain 12755 1727204104.59061: waiting for pending results... 12755 1727204104.59445: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 12755 1727204104.59562: in run() - task 12b410aa-8751-72e9-1a19-00000000067d 12755 1727204104.59583: variable 'ansible_search_path' from source: unknown 12755 1727204104.59594: variable 'ansible_search_path' from source: unknown 12755 1727204104.59649: calling self._execute() 12755 1727204104.59778: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.59792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.59810: variable 'omit' from source: magic vars 12755 1727204104.60308: variable 'ansible_distribution_major_version' from source: facts 12755 1727204104.60409: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204104.60501: variable 'profile_stat' from source: set_fact 12755 1727204104.60534: Evaluated conditional (profile_stat.stat.exists): False 12755 1727204104.60543: when evaluation is False, skipping this task 12755 1727204104.60627: _execute() done 12755 1727204104.60632: dumping result to json 12755 1727204104.60635: done dumping result, returning 12755 1727204104.60637: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [12b410aa-8751-72e9-1a19-00000000067d] 12755 1727204104.60640: sending task result for task 12b410aa-8751-72e9-1a19-00000000067d 12755 1727204104.60715: done sending task result for task 12b410aa-8751-72e9-1a19-00000000067d 12755 1727204104.60718: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12755 1727204104.60787: no more pending results, returning what we have 12755 1727204104.60793: results queue empty 12755 
1727204104.60795: checking for any_errors_fatal 12755 1727204104.60803: done checking for any_errors_fatal 12755 1727204104.60804: checking for max_fail_percentage 12755 1727204104.60806: done checking for max_fail_percentage 12755 1727204104.60807: checking to see if all hosts have failed and the running result is not ok 12755 1727204104.60808: done checking to see if all hosts have failed 12755 1727204104.60809: getting the remaining hosts for this loop 12755 1727204104.60811: done getting the remaining hosts for this loop 12755 1727204104.60817: getting the next task for host managed-node1 12755 1727204104.60949: done getting next task for host managed-node1 12755 1727204104.60953: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12755 1727204104.60988: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204104.60998: getting variables 12755 1727204104.61000: in VariableManager get_vars() 12755 1727204104.61335: Calling all_inventory to load vars for managed-node1 12755 1727204104.61339: Calling groups_inventory to load vars for managed-node1 12755 1727204104.61342: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204104.61360: Calling all_plugins_play to load vars for managed-node1 12755 1727204104.61364: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204104.61368: Calling groups_plugins_play to load vars for managed-node1 12755 1727204104.64175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204104.67400: done with get_vars() 12755 1727204104.67466: done getting variables 12755 1727204104.67550: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204104.67709: variable 'profile' from source: include params 12755 1727204104.67714: variable 'item' from source: include params 12755 1727204104.67793: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.091) 0:00:29.914 ***** 12755 1727204104.67834: entering _queue_task() for managed-node1/assert 12755 1727204104.68410: worker is 1 (out of 1 available) 12755 1727204104.68425: exiting _queue_task() for managed-node1/assert 12755 1727204104.68438: done queuing things up, now waiting for results queue to drain 12755 1727204104.68439: waiting for pending results... 
12755 1727204104.68925: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' 12755 1727204104.68931: in run() - task 12b410aa-8751-72e9-1a19-000000000364 12755 1727204104.68935: variable 'ansible_search_path' from source: unknown 12755 1727204104.68938: variable 'ansible_search_path' from source: unknown 12755 1727204104.68942: calling self._execute() 12755 1727204104.69109: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.69113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.69117: variable 'omit' from source: magic vars 12755 1727204104.69667: variable 'ansible_distribution_major_version' from source: facts 12755 1727204104.69682: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204104.69697: variable 'omit' from source: magic vars 12755 1727204104.69797: variable 'omit' from source: magic vars 12755 1727204104.69901: variable 'profile' from source: include params 12755 1727204104.69936: variable 'item' from source: include params 12755 1727204104.69996: variable 'item' from source: include params 12755 1727204104.70089: variable 'omit' from source: magic vars 12755 1727204104.70095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204104.70125: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204104.70153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204104.70174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204104.70195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204104.70400: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 12755 1727204104.70404: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.70407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.70409: Set connection var ansible_connection to ssh 12755 1727204104.70411: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204104.70413: Set connection var ansible_shell_type to sh 12755 1727204104.70428: Set connection var ansible_timeout to 10 12755 1727204104.70438: Set connection var ansible_shell_executable to /bin/sh 12755 1727204104.70446: Set connection var ansible_pipelining to False 12755 1727204104.70478: variable 'ansible_shell_executable' from source: unknown 12755 1727204104.70481: variable 'ansible_connection' from source: unknown 12755 1727204104.70484: variable 'ansible_module_compression' from source: unknown 12755 1727204104.70494: variable 'ansible_shell_type' from source: unknown 12755 1727204104.70497: variable 'ansible_shell_executable' from source: unknown 12755 1727204104.70499: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.70509: variable 'ansible_pipelining' from source: unknown 12755 1727204104.70512: variable 'ansible_timeout' from source: unknown 12755 1727204104.70514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.70715: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204104.70896: variable 'omit' from source: magic vars 12755 1727204104.70900: starting attempt loop 12755 1727204104.70902: running the handler 12755 1727204104.70916: variable 'lsr_net_profile_exists' from source: set_fact 12755 1727204104.70919: Evaluated conditional 
(lsr_net_profile_exists): True 12755 1727204104.70932: handler run complete 12755 1727204104.70953: attempt loop complete, returning result 12755 1727204104.70956: _execute() done 12755 1727204104.70966: dumping result to json 12755 1727204104.70975: done dumping result, returning 12755 1727204104.70985: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' [12b410aa-8751-72e9-1a19-000000000364] 12755 1727204104.70993: sending task result for task 12b410aa-8751-72e9-1a19-000000000364 12755 1727204104.71265: done sending task result for task 12b410aa-8751-72e9-1a19-000000000364 12755 1727204104.71269: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204104.71334: no more pending results, returning what we have 12755 1727204104.71338: results queue empty 12755 1727204104.71339: checking for any_errors_fatal 12755 1727204104.71345: done checking for any_errors_fatal 12755 1727204104.71346: checking for max_fail_percentage 12755 1727204104.71348: done checking for max_fail_percentage 12755 1727204104.71349: checking to see if all hosts have failed and the running result is not ok 12755 1727204104.71350: done checking to see if all hosts have failed 12755 1727204104.71351: getting the remaining hosts for this loop 12755 1727204104.71353: done getting the remaining hosts for this loop 12755 1727204104.71357: getting the next task for host managed-node1 12755 1727204104.71364: done getting next task for host managed-node1 12755 1727204104.71368: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 12755 1727204104.71371: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204104.71375: getting variables 12755 1727204104.71377: in VariableManager get_vars() 12755 1727204104.71443: Calling all_inventory to load vars for managed-node1 12755 1727204104.71448: Calling groups_inventory to load vars for managed-node1 12755 1727204104.71451: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204104.71465: Calling all_plugins_play to load vars for managed-node1 12755 1727204104.71469: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204104.71473: Calling groups_plugins_play to load vars for managed-node1 12755 1727204104.75774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204104.78795: done with get_vars() 12755 1727204104.78850: done getting variables 12755 1727204104.78930: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204104.79072: variable 'profile' from source: include params 12755 1727204104.79077: variable 'item' from source: include params 12755 1727204104.79148: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:55:04 -0400 
(0:00:00.113) 0:00:30.027 ***** 12755 1727204104.79193: entering _queue_task() for managed-node1/assert 12755 1727204104.79820: worker is 1 (out of 1 available) 12755 1727204104.79833: exiting _queue_task() for managed-node1/assert 12755 1727204104.79846: done queuing things up, now waiting for results queue to drain 12755 1727204104.79848: waiting for pending results... 12755 1727204104.80390: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' 12755 1727204104.80534: in run() - task 12b410aa-8751-72e9-1a19-000000000365 12755 1727204104.80550: variable 'ansible_search_path' from source: unknown 12755 1727204104.80554: variable 'ansible_search_path' from source: unknown 12755 1727204104.80700: calling self._execute() 12755 1727204104.80713: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.80725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.80744: variable 'omit' from source: magic vars 12755 1727204104.81203: variable 'ansible_distribution_major_version' from source: facts 12755 1727204104.81218: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204104.81230: variable 'omit' from source: magic vars 12755 1727204104.81287: variable 'omit' from source: magic vars 12755 1727204104.81419: variable 'profile' from source: include params 12755 1727204104.81464: variable 'item' from source: include params 12755 1727204104.81509: variable 'item' from source: include params 12755 1727204104.81537: variable 'omit' from source: magic vars 12755 1727204104.81586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204104.81636: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204104.81683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 
1727204104.81686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204104.81695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204104.81739: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204104.81742: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.81792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.81881: Set connection var ansible_connection to ssh 12755 1727204104.81888: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204104.81899: Set connection var ansible_shell_type to sh 12755 1727204104.81909: Set connection var ansible_timeout to 10 12755 1727204104.81917: Set connection var ansible_shell_executable to /bin/sh 12755 1727204104.82010: Set connection var ansible_pipelining to False 12755 1727204104.82014: variable 'ansible_shell_executable' from source: unknown 12755 1727204104.82017: variable 'ansible_connection' from source: unknown 12755 1727204104.82019: variable 'ansible_module_compression' from source: unknown 12755 1727204104.82021: variable 'ansible_shell_type' from source: unknown 12755 1727204104.82023: variable 'ansible_shell_executable' from source: unknown 12755 1727204104.82025: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.82027: variable 'ansible_pipelining' from source: unknown 12755 1727204104.82030: variable 'ansible_timeout' from source: unknown 12755 1727204104.82032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.82176: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204104.82192: variable 'omit' from source: magic vars 12755 1727204104.82200: starting attempt loop 12755 1727204104.82203: running the handler 12755 1727204104.82347: variable 'lsr_net_profile_ansible_managed' from source: set_fact 12755 1727204104.82354: Evaluated conditional (lsr_net_profile_ansible_managed): True 12755 1727204104.82363: handler run complete 12755 1727204104.82391: attempt loop complete, returning result 12755 1727204104.82395: _execute() done 12755 1727204104.82398: dumping result to json 12755 1727204104.82447: done dumping result, returning 12755 1727204104.82451: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [12b410aa-8751-72e9-1a19-000000000365] 12755 1727204104.82454: sending task result for task 12b410aa-8751-72e9-1a19-000000000365 12755 1727204104.82661: done sending task result for task 12b410aa-8751-72e9-1a19-000000000365 12755 1727204104.82666: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204104.82719: no more pending results, returning what we have 12755 1727204104.82723: results queue empty 12755 1727204104.82724: checking for any_errors_fatal 12755 1727204104.82731: done checking for any_errors_fatal 12755 1727204104.82731: checking for max_fail_percentage 12755 1727204104.82733: done checking for max_fail_percentage 12755 1727204104.82734: checking to see if all hosts have failed and the running result is not ok 12755 1727204104.82735: done checking to see if all hosts have failed 12755 1727204104.82736: getting the remaining hosts for this loop 12755 1727204104.82737: done getting the remaining hosts for this loop 12755 1727204104.82742: getting the next task for host managed-node1 12755 1727204104.82748: done getting 
next task for host managed-node1 12755 1727204104.82751: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12755 1727204104.82754: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204104.82758: getting variables 12755 1727204104.82760: in VariableManager get_vars() 12755 1727204104.82821: Calling all_inventory to load vars for managed-node1 12755 1727204104.82826: Calling groups_inventory to load vars for managed-node1 12755 1727204104.82830: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204104.82843: Calling all_plugins_play to load vars for managed-node1 12755 1727204104.82847: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204104.82851: Calling groups_plugins_play to load vars for managed-node1 12755 1727204104.87229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204104.91259: done with get_vars() 12755 1727204104.91331: done getting variables 12755 1727204104.91428: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204104.91591: variable 'profile' from source: include params 12755 1727204104.91595: variable 'item' from 
source: include params 12755 1727204104.91684: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:55:04 -0400 (0:00:00.125) 0:00:30.153 ***** 12755 1727204104.91734: entering _queue_task() for managed-node1/assert 12755 1727204104.92153: worker is 1 (out of 1 available) 12755 1727204104.92297: exiting _queue_task() for managed-node1/assert 12755 1727204104.92310: done queuing things up, now waiting for results queue to drain 12755 1727204104.92312: waiting for pending results... 12755 1727204104.92575: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1 12755 1727204104.92761: in run() - task 12b410aa-8751-72e9-1a19-000000000366 12755 1727204104.92766: variable 'ansible_search_path' from source: unknown 12755 1727204104.92772: variable 'ansible_search_path' from source: unknown 12755 1727204104.92775: calling self._execute() 12755 1727204104.92996: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.93003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.93006: variable 'omit' from source: magic vars 12755 1727204104.93556: variable 'ansible_distribution_major_version' from source: facts 12755 1727204104.93842: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204104.93846: variable 'omit' from source: magic vars 12755 1727204104.93849: variable 'omit' from source: magic vars 12755 1727204104.93851: variable 'profile' from source: include params 12755 1727204104.93854: variable 'item' from source: include params 12755 1727204104.93876: variable 'item' from source: include params 12755 1727204104.94129: variable 'omit' from source: magic vars 12755 1727204104.94178: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204104.94224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204104.94249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204104.94270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204104.94286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204104.94552: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204104.94556: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.94561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.94925: Set connection var ansible_connection to ssh 12755 1727204104.94934: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204104.94938: Set connection var ansible_shell_type to sh 12755 1727204104.94957: Set connection var ansible_timeout to 10 12755 1727204104.94964: Set connection var ansible_shell_executable to /bin/sh 12755 1727204104.94972: Set connection var ansible_pipelining to False 12755 1727204104.95203: variable 'ansible_shell_executable' from source: unknown 12755 1727204104.95206: variable 'ansible_connection' from source: unknown 12755 1727204104.95209: variable 'ansible_module_compression' from source: unknown 12755 1727204104.95211: variable 'ansible_shell_type' from source: unknown 12755 1727204104.95213: variable 'ansible_shell_executable' from source: unknown 12755 1727204104.95215: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204104.95220: variable 'ansible_pipelining' from source: unknown 12755 1727204104.95223: variable 'ansible_timeout' 
from source: unknown 12755 1727204104.95225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204104.95400: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204104.95420: variable 'omit' from source: magic vars 12755 1727204104.95427: starting attempt loop 12755 1727204104.95430: running the handler 12755 1727204104.95668: variable 'lsr_net_profile_fingerprint' from source: set_fact 12755 1727204104.95675: Evaluated conditional (lsr_net_profile_fingerprint): True 12755 1727204104.95678: handler run complete 12755 1727204104.95681: attempt loop complete, returning result 12755 1727204104.95683: _execute() done 12755 1727204104.95686: dumping result to json 12755 1727204104.95692: done dumping result, returning 12755 1727204104.95695: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1 [12b410aa-8751-72e9-1a19-000000000366] 12755 1727204104.95697: sending task result for task 12b410aa-8751-72e9-1a19-000000000366 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204104.95891: no more pending results, returning what we have 12755 1727204104.95896: results queue empty 12755 1727204104.95897: checking for any_errors_fatal 12755 1727204104.95906: done checking for any_errors_fatal 12755 1727204104.95907: checking for max_fail_percentage 12755 1727204104.95909: done checking for max_fail_percentage 12755 1727204104.95910: checking to see if all hosts have failed and the running result is not ok 12755 1727204104.95911: done checking to see if all hosts have failed 12755 1727204104.95912: getting the remaining hosts for this loop 12755 1727204104.95914: done getting the remaining hosts for 
this loop 12755 1727204104.95919: getting the next task for host managed-node1 12755 1727204104.95931: done getting next task for host managed-node1 12755 1727204104.95935: ^ task is: TASK: ** TEST check polling interval 12755 1727204104.95937: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204104.95942: getting variables 12755 1727204104.95944: in VariableManager get_vars() 12755 1727204104.96016: Calling all_inventory to load vars for managed-node1 12755 1727204104.96020: Calling groups_inventory to load vars for managed-node1 12755 1727204104.96024: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204104.96031: done sending task result for task 12b410aa-8751-72e9-1a19-000000000366 12755 1727204104.96035: WORKER PROCESS EXITING 12755 1727204104.96116: Calling all_plugins_play to load vars for managed-node1 12755 1727204104.96121: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204104.96126: Calling groups_plugins_play to load vars for managed-node1 12755 1727204104.98776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204105.01994: done with get_vars() 12755 1727204105.02045: done getting variables 12755 1727204105.02129: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.104) 0:00:30.257 ***** 12755 1727204105.02165: entering _queue_task() for managed-node1/command 12755 1727204105.02567: worker is 1 (out of 1 available) 12755 1727204105.02581: exiting _queue_task() for managed-node1/command 12755 1727204105.02801: done queuing things up, now waiting for results queue to drain 12755 1727204105.02803: waiting for pending results... 12755 1727204105.02970: running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval 12755 1727204105.03139: in run() - task 12b410aa-8751-72e9-1a19-000000000071 12755 1727204105.03145: variable 'ansible_search_path' from source: unknown 12755 1727204105.03154: calling self._execute() 12755 1727204105.03268: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204105.03277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204105.03293: variable 'omit' from source: magic vars 12755 1727204105.04099: variable 'ansible_distribution_major_version' from source: facts 12755 1727204105.04104: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204105.04108: variable 'omit' from source: magic vars 12755 1727204105.04112: variable 'omit' from source: magic vars 12755 1727204105.04174: variable 'controller_device' from source: play vars 12755 1727204105.04201: variable 'omit' from source: magic vars 12755 1727204105.04321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204105.04325: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204105.04328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204105.04445: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204105.04449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204105.04452: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204105.04456: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204105.04459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204105.04575: Set connection var ansible_connection to ssh 12755 1727204105.04586: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204105.04590: Set connection var ansible_shell_type to sh 12755 1727204105.04664: Set connection var ansible_timeout to 10 12755 1727204105.04667: Set connection var ansible_shell_executable to /bin/sh 12755 1727204105.04670: Set connection var ansible_pipelining to False 12755 1727204105.04673: variable 'ansible_shell_executable' from source: unknown 12755 1727204105.04675: variable 'ansible_connection' from source: unknown 12755 1727204105.04678: variable 'ansible_module_compression' from source: unknown 12755 1727204105.04681: variable 'ansible_shell_type' from source: unknown 12755 1727204105.04684: variable 'ansible_shell_executable' from source: unknown 12755 1727204105.04687: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204105.04772: variable 'ansible_pipelining' from source: unknown 12755 1727204105.04777: variable 'ansible_timeout' from source: unknown 12755 1727204105.04779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204105.04989: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204105.04995: variable 'omit' from source: magic vars 12755 1727204105.04998: starting attempt loop 12755 1727204105.05001: running the handler 12755 1727204105.05004: _low_level_execute_command(): starting 12755 1727204105.05007: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204105.05892: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204105.05926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204105.05943: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.05980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.06050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.07874: stdout chunk (state=3): >>>/root <<< 12755 1727204105.08155: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 12755 1727204105.08160: stdout chunk (state=3): >>><<< 12755 1727204105.08163: stderr chunk (state=3): >>><<< 12755 1727204105.08296: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204105.08300: _low_level_execute_command(): starting 12755 1727204105.08304: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855 `" && echo ansible-tmp-1727204105.0819414-14639-199061042878855="` echo /root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855 `" ) && sleep 0' 12755 1727204105.09124: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204105.09130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.09240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.09251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.11299: stdout chunk (state=3): >>>ansible-tmp-1727204105.0819414-14639-199061042878855=/root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855 <<< 12755 1727204105.11705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204105.11710: stdout chunk (state=3): >>><<< 12755 1727204105.11712: stderr chunk (state=3): >>><<< 12755 1727204105.11715: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204105.0819414-14639-199061042878855=/root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204105.11718: variable 'ansible_module_compression' from source: unknown 12755 1727204105.11720: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204105.11723: variable 'ansible_facts' from source: unknown 12755 1727204105.11828: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/AnsiballZ_command.py 12755 1727204105.12095: Sending initial data 12755 1727204105.12099: Sent initial data (156 bytes) 12755 1727204105.12648: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204105.12659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204105.12671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204105.12789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.12823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.12980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.14704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204105.14788: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204105.14869: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpw5l39pmg /root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/AnsiballZ_command.py <<< 12755 1727204105.14875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/AnsiballZ_command.py" <<< 12755 1727204105.14977: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpw5l39pmg" to remote "/root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/AnsiballZ_command.py" <<< 12755 1727204105.16362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204105.16409: stderr chunk (state=3): >>><<< 12755 1727204105.16424: stdout chunk (state=3): >>><<< 12755 1727204105.16459: done transferring module to remote 12755 1727204105.16492: _low_level_execute_command(): starting 12755 1727204105.16510: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/ /root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/AnsiballZ_command.py && sleep 0' 12755 1727204105.17242: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204105.17273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204105.17294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204105.17315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204105.17340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 
12755 1727204105.17414: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204105.17422: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204105.17509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.17541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.17619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.19858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204105.19862: stdout chunk (state=3): >>><<< 12755 1727204105.19865: stderr chunk (state=3): >>><<< 12755 1727204105.19872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204105.19875: _low_level_execute_command(): starting 12755 1727204105.19878: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/AnsiballZ_command.py && sleep 0' 12755 1727204105.20534: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204105.20547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204105.20551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204105.20553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204105.20556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204105.20558: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204105.20654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204105.20669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204105.20673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.20675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.20776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.39197: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:55:05.384647", "end": "2024-09-24 14:55:05.389719", "delta": "0:00:00.005072", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204105.41197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204105.41202: stdout chunk (state=3): >>><<< 12755 1727204105.41205: stderr chunk (state=3): >>><<< 12755 1727204105.41208: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:55:05.384647", "end": "2024-09-24 14:55:05.389719", "delta": "0:00:00.005072", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204105.41211: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204105.41214: _low_level_execute_command(): starting 12755 1727204105.41216: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204105.0819414-14639-199061042878855/ > /dev/null 2>&1 && sleep 0' 12755 1727204105.42047: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204105.42096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204105.42108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.42131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.42210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.44368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204105.44440: stderr chunk (state=3): >>><<< 12755 1727204105.44445: stdout chunk (state=3): >>><<< 12755 1727204105.44491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204105.44500: handler run complete 12755 1727204105.44550: Evaluated conditional (False): False 12755 
1727204105.45195: variable 'result' from source: unknown 12755 1727204105.45199: Evaluated conditional ('110' in result.stdout): True 12755 1727204105.45207: attempt loop complete, returning result 12755 1727204105.45210: _execute() done 12755 1727204105.45212: dumping result to json 12755 1727204105.45215: done dumping result, returning 12755 1727204105.45409: done running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval [12b410aa-8751-72e9-1a19-000000000071] 12755 1727204105.45413: sending task result for task 12b410aa-8751-72e9-1a19-000000000071 12755 1727204105.45542: done sending task result for task 12b410aa-8751-72e9-1a19-000000000071 12755 1727204105.45546: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.005072", "end": "2024-09-24 14:55:05.389719", "rc": 0, "start": "2024-09-24 14:55:05.384647" } STDOUT: MII Polling Interval (ms): 110 12755 1727204105.45656: no more pending results, returning what we have 12755 1727204105.45661: results queue empty 12755 1727204105.45663: checking for any_errors_fatal 12755 1727204105.45671: done checking for any_errors_fatal 12755 1727204105.45672: checking for max_fail_percentage 12755 1727204105.45674: done checking for max_fail_percentage 12755 1727204105.45675: checking to see if all hosts have failed and the running result is not ok 12755 1727204105.45676: done checking to see if all hosts have failed 12755 1727204105.45677: getting the remaining hosts for this loop 12755 1727204105.45679: done getting the remaining hosts for this loop 12755 1727204105.45684: getting the next task for host managed-node1 12755 1727204105.45695: done getting next task for host managed-node1 12755 1727204105.45699: ^ task is: TASK: ** TEST check IPv4 12755 1727204105.46094: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204105.46100: getting variables 12755 1727204105.46102: in VariableManager get_vars() 12755 1727204105.46162: Calling all_inventory to load vars for managed-node1 12755 1727204105.46165: Calling groups_inventory to load vars for managed-node1 12755 1727204105.46168: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204105.46181: Calling all_plugins_play to load vars for managed-node1 12755 1727204105.46184: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204105.46187: Calling groups_plugins_play to load vars for managed-node1 12755 1727204105.51183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204105.57609: done with get_vars() 12755 1727204105.57671: done getting variables 12755 1727204105.58050: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Tuesday 24 September 2024 14:55:05 -0400 (0:00:00.559) 0:00:30.816 ***** 12755 1727204105.58084: entering _queue_task() for managed-node1/command 12755 1727204105.58856: worker is 1 (out of 1 available) 12755 1727204105.58870: exiting _queue_task() for managed-node1/command 12755 1727204105.58886: done queuing things up, now waiting for results queue to drain 12755 1727204105.58888: waiting for pending results... 
12755 1727204105.59723: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 12755 1727204105.59786: in run() - task 12b410aa-8751-72e9-1a19-000000000072 12755 1727204105.59806: variable 'ansible_search_path' from source: unknown 12755 1727204105.60000: calling self._execute() 12755 1727204105.60078: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204105.60086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204105.60323: variable 'omit' from source: magic vars 12755 1727204105.61366: variable 'ansible_distribution_major_version' from source: facts 12755 1727204105.61380: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204105.61390: variable 'omit' from source: magic vars 12755 1727204105.61420: variable 'omit' from source: magic vars 12755 1727204105.61655: variable 'controller_device' from source: play vars 12755 1727204105.61678: variable 'omit' from source: magic vars 12755 1727204105.61810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204105.61864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204105.61892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204105.61914: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204105.61933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204105.62086: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204105.62091: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204105.62171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204105.62421: Set 
connection var ansible_connection to ssh 12755 1727204105.62432: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204105.62435: Set connection var ansible_shell_type to sh 12755 1727204105.62450: Set connection var ansible_timeout to 10 12755 1727204105.62458: Set connection var ansible_shell_executable to /bin/sh 12755 1727204105.62466: Set connection var ansible_pipelining to False 12755 1727204105.62607: variable 'ansible_shell_executable' from source: unknown 12755 1727204105.62611: variable 'ansible_connection' from source: unknown 12755 1727204105.62614: variable 'ansible_module_compression' from source: unknown 12755 1727204105.62617: variable 'ansible_shell_type' from source: unknown 12755 1727204105.62707: variable 'ansible_shell_executable' from source: unknown 12755 1727204105.62711: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204105.62714: variable 'ansible_pipelining' from source: unknown 12755 1727204105.62716: variable 'ansible_timeout' from source: unknown 12755 1727204105.62717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204105.63038: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204105.63104: variable 'omit' from source: magic vars 12755 1727204105.63107: starting attempt loop 12755 1727204105.63110: running the handler 12755 1727204105.63134: _low_level_execute_command(): starting 12755 1727204105.63399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204105.64820: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204105.64837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 
1727204105.65012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.65276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.65319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.67322: stdout chunk (state=3): >>>/root <<< 12755 1727204105.67462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204105.67467: stdout chunk (state=3): >>><<< 12755 1727204105.67477: stderr chunk (state=3): >>><<< 12755 1727204105.67541: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204105.67560: _low_level_execute_command(): starting 12755 1727204105.67569: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466 `" && echo ansible-tmp-1727204105.6754274-14657-211636684322466="` echo /root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466 `" ) && sleep 0' 12755 1727204105.68752: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204105.68904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204105.68917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204105.68998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204105.69002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204105.69005: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204105.69022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204105.69040: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 12755 1727204105.69291: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.69301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.69398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.71582: stdout chunk (state=3): >>>ansible-tmp-1727204105.6754274-14657-211636684322466=/root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466 <<< 12755 1727204105.71903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204105.71907: stderr chunk (state=3): >>><<< 12755 1727204105.71909: stdout chunk (state=3): >>><<< 12755 1727204105.71933: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204105.6754274-14657-211636684322466=/root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204105.71975: variable 'ansible_module_compression' from source: unknown 12755 1727204105.72037: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204105.72073: variable 'ansible_facts' from source: unknown 12755 1727204105.72573: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/AnsiballZ_command.py 12755 1727204105.73118: Sending initial data 12755 1727204105.73126: Sent initial data (156 bytes) 12755 1727204105.74274: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204105.74437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204105.74658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.74683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.77097: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpltzq56lq" to remote "/root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/AnsiballZ_command.py" <<< 12755 1727204105.77102: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpltzq56lq /root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/AnsiballZ_command.py <<< 12755 1727204105.78910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204105.79513: stderr chunk (state=3): >>><<< 12755 1727204105.79518: stdout chunk (state=3): >>><<< 12755 1727204105.79549: done transferring module to remote 12755 1727204105.79564: _low_level_execute_command(): starting 12755 1727204105.79570: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/ /root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/AnsiballZ_command.py && sleep 0' 12755 1727204105.81374: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204105.81378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204105.81381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204105.81409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204105.81550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204105.81856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.81875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.81969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204105.84117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204105.84276: stderr chunk (state=3): >>><<< 12755 1727204105.84290: stdout chunk (state=3): >>><<< 12755 1727204105.84318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204105.84328: _low_level_execute_command(): starting 12755 1727204105.84378: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/AnsiballZ_command.py && sleep 0' 12755 1727204105.85806: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204105.86009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204105.86119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204105.86140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204105.86181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204105.86239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204106.05593: stdout 
chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.217/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 233sec preferred_lft 233sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:06.050949", "end": "2024-09-24 14:55:06.054972", "delta": "0:00:00.004023", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204106.07912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204106.07917: stdout chunk (state=3): >>><<< 12755 1727204106.07920: stderr chunk (state=3): >>><<< 12755 1727204106.07923: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.217/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 233sec preferred_lft 233sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:06.050949", "end": "2024-09-24 14:55:06.054972", "delta": "0:00:00.004023", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204106.07938: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204106.07957: _low_level_execute_command(): starting 12755 1727204106.07969: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204105.6754274-14657-211636684322466/ > /dev/null 2>&1 && sleep 0' 12755 1727204106.09133: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204106.09167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204106.09256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.09337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204106.09365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204106.09392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204106.09587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204106.11663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204106.12099: stderr chunk (state=3): >>><<< 12755 1727204106.12103: stdout chunk (state=3): >>><<< 12755 1727204106.12106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204106.12109: handler run complete 12755 1727204106.12111: Evaluated conditional (False): False 12755 1727204106.12397: variable 'result' from source: set_fact 12755 1727204106.12480: Evaluated conditional ('192.0.2' in result.stdout): True 12755 1727204106.12551: attempt loop complete, returning result 12755 1727204106.12606: _execute() done 12755 1727204106.12616: dumping result to json 12755 1727204106.12627: done dumping result, returning 12755 1727204106.12648: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 [12b410aa-8751-72e9-1a19-000000000072] 12755 1727204106.12700: sending task result for task 12b410aa-8751-72e9-1a19-000000000072 ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.004023", "end": "2024-09-24 14:55:06.054972", "rc": 0, "start": "2024-09-24 14:55:06.050949" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.217/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 233sec preferred_lft 233sec 12755 1727204106.13215: no more pending results, 
returning what we have 12755 1727204106.13222: results queue empty 12755 1727204106.13224: checking for any_errors_fatal 12755 1727204106.13232: done checking for any_errors_fatal 12755 1727204106.13233: checking for max_fail_percentage 12755 1727204106.13236: done checking for max_fail_percentage 12755 1727204106.13237: checking to see if all hosts have failed and the running result is not ok 12755 1727204106.13238: done checking to see if all hosts have failed 12755 1727204106.13239: getting the remaining hosts for this loop 12755 1727204106.13241: done getting the remaining hosts for this loop 12755 1727204106.13247: getting the next task for host managed-node1 12755 1727204106.13254: done getting next task for host managed-node1 12755 1727204106.13258: ^ task is: TASK: ** TEST check IPv6 12755 1727204106.13261: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204106.13266: getting variables 12755 1727204106.13268: in VariableManager get_vars() 12755 1727204106.13459: Calling all_inventory to load vars for managed-node1 12755 1727204106.13464: Calling groups_inventory to load vars for managed-node1 12755 1727204106.13467: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204106.13483: Calling all_plugins_play to load vars for managed-node1 12755 1727204106.13487: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204106.13621: Calling groups_plugins_play to load vars for managed-node1 12755 1727204106.14232: done sending task result for task 12b410aa-8751-72e9-1a19-000000000072 12755 1727204106.14236: WORKER PROCESS EXITING 12755 1727204106.21232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204106.23672: done with get_vars() 12755 1727204106.23707: done getting variables 12755 1727204106.23760: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.657) 0:00:31.473 ***** 12755 1727204106.23788: entering _queue_task() for managed-node1/command 12755 1727204106.24092: worker is 1 (out of 1 available) 12755 1727204106.24108: exiting _queue_task() for managed-node1/command 12755 1727204106.24124: done queuing things up, now waiting for results queue to drain 12755 1727204106.24127: waiting for pending results... 
12755 1727204106.24335: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 12755 1727204106.24416: in run() - task 12b410aa-8751-72e9-1a19-000000000073 12755 1727204106.24429: variable 'ansible_search_path' from source: unknown 12755 1727204106.24467: calling self._execute() 12755 1727204106.24564: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204106.24571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204106.24586: variable 'omit' from source: magic vars 12755 1727204106.24927: variable 'ansible_distribution_major_version' from source: facts 12755 1727204106.24939: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204106.24946: variable 'omit' from source: magic vars 12755 1727204106.24965: variable 'omit' from source: magic vars 12755 1727204106.25051: variable 'controller_device' from source: play vars 12755 1727204106.25068: variable 'omit' from source: magic vars 12755 1727204106.25108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204106.25146: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204106.25164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204106.25181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204106.25195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204106.25224: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204106.25230: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204106.25233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204106.25324: Set 
connection var ansible_connection to ssh 12755 1727204106.25339: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204106.25344: Set connection var ansible_shell_type to sh 12755 1727204106.25385: Set connection var ansible_timeout to 10 12755 1727204106.25390: Set connection var ansible_shell_executable to /bin/sh 12755 1727204106.25402: Set connection var ansible_pipelining to False 12755 1727204106.25405: variable 'ansible_shell_executable' from source: unknown 12755 1727204106.25408: variable 'ansible_connection' from source: unknown 12755 1727204106.25411: variable 'ansible_module_compression' from source: unknown 12755 1727204106.25413: variable 'ansible_shell_type' from source: unknown 12755 1727204106.25416: variable 'ansible_shell_executable' from source: unknown 12755 1727204106.25421: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204106.25446: variable 'ansible_pipelining' from source: unknown 12755 1727204106.25483: variable 'ansible_timeout' from source: unknown 12755 1727204106.25486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204106.25796: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204106.25800: variable 'omit' from source: magic vars 12755 1727204106.25803: starting attempt loop 12755 1727204106.25805: running the handler 12755 1727204106.25807: _low_level_execute_command(): starting 12755 1727204106.25809: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204106.26526: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.26636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204106.26640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204106.26809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204106.26882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204106.28727: stdout chunk (state=3): >>>/root <<< 12755 1727204106.29003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204106.29008: stdout chunk (state=3): >>><<< 12755 1727204106.29011: stderr chunk (state=3): >>><<< 12755 1727204106.29023: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204106.29045: _low_level_execute_command(): starting 12755 1727204106.29063: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079 `" && echo ansible-tmp-1727204106.2902322-14766-277330694430079="` echo /root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079 `" ) && sleep 0' 12755 1727204106.29547: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204106.29574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.29626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204106.29645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204106.29692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204106.31769: stdout chunk (state=3): >>>ansible-tmp-1727204106.2902322-14766-277330694430079=/root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079 <<< 12755 1727204106.31886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204106.31925: stderr chunk (state=3): >>><<< 12755 1727204106.31929: stdout chunk (state=3): >>><<< 12755 1727204106.31949: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204106.2902322-14766-277330694430079=/root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204106.31987: variable 'ansible_module_compression' from source: unknown 12755 1727204106.32039: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204106.32075: variable 'ansible_facts' from source: unknown 12755 1727204106.32135: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/AnsiballZ_command.py 12755 1727204106.32258: Sending initial data 12755 1727204106.32262: Sent initial data (156 bytes) 12755 1727204106.32694: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204106.32730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204106.32734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.32737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204106.32741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204106.32748: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.32802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204106.32806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204106.32852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204106.34537: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204106.34580: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204106.34619: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmp99je717k /root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/AnsiballZ_command.py <<< 12755 1727204106.34628: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/AnsiballZ_command.py" <<< 12755 1727204106.34659: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmp99je717k" to remote "/root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/AnsiballZ_command.py" <<< 12755 1727204106.35450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204106.35528: stderr chunk (state=3): >>><<< 12755 1727204106.35531: stdout chunk (state=3): >>><<< 12755 1727204106.35553: done transferring module to remote 12755 1727204106.35567: _low_level_execute_command(): starting 12755 1727204106.35570: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/ /root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/AnsiballZ_command.py && sleep 0' 12755 1727204106.36026: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204106.36065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204106.36068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 
1727204106.36071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204106.36073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204106.36075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.36134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204106.36136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204106.36187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204106.38169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204106.38231: stderr chunk (state=3): >>><<< 12755 1727204106.38236: stdout chunk (state=3): >>><<< 12755 1727204106.38255: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204106.38258: _low_level_execute_command(): starting 12755 1727204106.38264: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/AnsiballZ_command.py && sleep 0' 12755 1727204106.38769: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204106.38773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.38775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204106.38778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204106.38780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.38825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 
1727204106.38829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204106.38905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204106.57263: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::11b/128 scope global dynamic noprefixroute \n valid_lft 233sec preferred_lft 233sec\n inet6 2001:db8::65b1:9563:8399:8ce1/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::1b16:2538:8aac:b163/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:06.567874", "end": "2024-09-24 14:55:06.571718", "delta": "0:00:00.003844", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204106.59167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204106.59227: stderr chunk (state=3): >>><<< 12755 1727204106.59231: stdout chunk (state=3): >>><<< 12755 1727204106.59250: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::11b/128 scope global dynamic noprefixroute \n valid_lft 233sec preferred_lft 233sec\n inet6 2001:db8::65b1:9563:8399:8ce1/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::1b16:2538:8aac:b163/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:55:06.567874", "end": "2024-09-24 14:55:06.571718", "delta": "0:00:00.003844", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204106.59293: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204106.59303: _low_level_execute_command(): starting 12755 1727204106.59310: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204106.2902322-14766-277330694430079/ > /dev/null 2>&1 && sleep 0' 12755 1727204106.59799: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204106.59802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.59811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204106.59814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204106.59862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204106.59866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204106.59968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204106.61974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204106.62034: stderr chunk (state=3): >>><<< 12755 1727204106.62038: stdout chunk (state=3): >>><<< 12755 1727204106.62054: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204106.62062: handler run complete 12755 1727204106.62088: Evaluated conditional (False): False 12755 1727204106.62231: variable 'result' from source: set_fact 12755 1727204106.62247: Evaluated conditional ('2001' in result.stdout): True 12755 1727204106.62259: attempt loop complete, returning result 12755 1727204106.62264: _execute() done 12755 1727204106.62267: dumping result to json 12755 1727204106.62274: done dumping result, returning 12755 1727204106.62282: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 [12b410aa-8751-72e9-1a19-000000000073] 12755 1727204106.62287: sending task result for task 12b410aa-8751-72e9-1a19-000000000073 12755 1727204106.62405: done sending task result for task 12b410aa-8751-72e9-1a19-000000000073 12755 1727204106.62408: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003844", "end": "2024-09-24 14:55:06.571718", "rc": 0, "start": "2024-09-24 14:55:06.567874" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::11b/128 scope global dynamic noprefixroute valid_lft 233sec preferred_lft 233sec inet6 2001:db8::65b1:9563:8399:8ce1/64 scope global dynamic noprefixroute valid_lft 1794sec preferred_lft 1794sec inet6 fe80::1b16:2538:8aac:b163/64 scope link noprefixroute valid_lft forever preferred_lft forever 12755 1727204106.62514: no more pending results, returning what we have 12755 1727204106.62520: results queue empty 12755 1727204106.62521: checking for any_errors_fatal 12755 1727204106.62528: done checking for any_errors_fatal 12755 1727204106.62529: checking for max_fail_percentage 12755 1727204106.62531: done checking for max_fail_percentage 
12755 1727204106.62532: checking to see if all hosts have failed and the running result is not ok 12755 1727204106.62533: done checking to see if all hosts have failed 12755 1727204106.62534: getting the remaining hosts for this loop 12755 1727204106.62535: done getting the remaining hosts for this loop 12755 1727204106.62540: getting the next task for host managed-node1 12755 1727204106.62548: done getting next task for host managed-node1 12755 1727204106.62554: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12755 1727204106.62556: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204106.62574: getting variables 12755 1727204106.62576: in VariableManager get_vars() 12755 1727204106.62642: Calling all_inventory to load vars for managed-node1 12755 1727204106.62645: Calling groups_inventory to load vars for managed-node1 12755 1727204106.62648: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204106.62660: Calling all_plugins_play to load vars for managed-node1 12755 1727204106.62663: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204106.62666: Calling groups_plugins_play to load vars for managed-node1 12755 1727204106.64401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204106.66598: done with get_vars() 12755 1727204106.66623: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.429) 0:00:31.902 ***** 12755 1727204106.66709: entering _queue_task() for managed-node1/include_tasks 12755 1727204106.66983: worker is 1 (out of 1 available) 12755 1727204106.67001: exiting _queue_task() for managed-node1/include_tasks 12755 1727204106.67015: done queuing things up, now waiting for results queue to drain 12755 1727204106.67017: waiting for pending results... 
12755 1727204106.67219: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12755 1727204106.67350: in run() - task 12b410aa-8751-72e9-1a19-00000000007b 12755 1727204106.67364: variable 'ansible_search_path' from source: unknown 12755 1727204106.67367: variable 'ansible_search_path' from source: unknown 12755 1727204106.67402: calling self._execute() 12755 1727204106.67493: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204106.67502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204106.67512: variable 'omit' from source: magic vars 12755 1727204106.68085: variable 'ansible_distribution_major_version' from source: facts 12755 1727204106.68088: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204106.68094: _execute() done 12755 1727204106.68099: dumping result to json 12755 1727204106.68101: done dumping result, returning 12755 1727204106.68104: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-72e9-1a19-00000000007b] 12755 1727204106.68106: sending task result for task 12b410aa-8751-72e9-1a19-00000000007b 12755 1727204106.68175: done sending task result for task 12b410aa-8751-72e9-1a19-00000000007b 12755 1727204106.68178: WORKER PROCESS EXITING 12755 1727204106.68320: no more pending results, returning what we have 12755 1727204106.68326: in VariableManager get_vars() 12755 1727204106.68393: Calling all_inventory to load vars for managed-node1 12755 1727204106.68397: Calling groups_inventory to load vars for managed-node1 12755 1727204106.68400: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204106.68413: Calling all_plugins_play to load vars for managed-node1 12755 1727204106.68417: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204106.68421: Calling 
groups_plugins_play to load vars for managed-node1 12755 1727204106.70068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204106.72046: done with get_vars() 12755 1727204106.72079: variable 'ansible_search_path' from source: unknown 12755 1727204106.72081: variable 'ansible_search_path' from source: unknown 12755 1727204106.72129: we have included files to process 12755 1727204106.72130: generating all_blocks data 12755 1727204106.72134: done generating all_blocks data 12755 1727204106.72139: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204106.72140: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204106.72142: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204106.72838: done processing included file 12755 1727204106.72840: iterating over new_blocks loaded from include file 12755 1727204106.72842: in VariableManager get_vars() 12755 1727204106.72882: done with get_vars() 12755 1727204106.72884: filtering new block on tags 12755 1727204106.72909: done filtering new block on tags 12755 1727204106.72912: in VariableManager get_vars() 12755 1727204106.72952: done with get_vars() 12755 1727204106.72954: filtering new block on tags 12755 1727204106.72982: done filtering new block on tags 12755 1727204106.72985: in VariableManager get_vars() 12755 1727204106.73028: done with get_vars() 12755 1727204106.73031: filtering new block on tags 12755 1727204106.73055: done filtering new block on tags 12755 1727204106.73058: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 12755 1727204106.73064: extending task lists for 
all hosts with included blocks 12755 1727204106.74170: done extending task lists 12755 1727204106.74171: done processing included files 12755 1727204106.74172: results queue empty 12755 1727204106.74173: checking for any_errors_fatal 12755 1727204106.74179: done checking for any_errors_fatal 12755 1727204106.74180: checking for max_fail_percentage 12755 1727204106.74181: done checking for max_fail_percentage 12755 1727204106.74182: checking to see if all hosts have failed and the running result is not ok 12755 1727204106.74183: done checking to see if all hosts have failed 12755 1727204106.74184: getting the remaining hosts for this loop 12755 1727204106.74186: done getting the remaining hosts for this loop 12755 1727204106.74190: getting the next task for host managed-node1 12755 1727204106.74196: done getting next task for host managed-node1 12755 1727204106.74199: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12755 1727204106.74202: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204106.74215: getting variables 12755 1727204106.74217: in VariableManager get_vars() 12755 1727204106.74243: Calling all_inventory to load vars for managed-node1 12755 1727204106.74247: Calling groups_inventory to load vars for managed-node1 12755 1727204106.74250: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204106.74256: Calling all_plugins_play to load vars for managed-node1 12755 1727204106.74260: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204106.74264: Calling groups_plugins_play to load vars for managed-node1 12755 1727204106.76433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204106.79200: done with get_vars() 12755 1727204106.79249: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:06 -0400 (0:00:00.126) 0:00:32.029 ***** 12755 1727204106.79359: entering _queue_task() for managed-node1/setup 12755 1727204106.80005: worker is 1 (out of 1 available) 12755 1727204106.80020: exiting _queue_task() for managed-node1/setup 12755 1727204106.80034: done queuing things up, now waiting for results queue to drain 12755 1727204106.80036: waiting for pending results... 
12755 1727204106.80651: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12755 1727204106.80838: in run() - task 12b410aa-8751-72e9-1a19-0000000006c5 12755 1727204106.80851: variable 'ansible_search_path' from source: unknown 12755 1727204106.80861: variable 'ansible_search_path' from source: unknown 12755 1727204106.80948: calling self._execute() 12755 1727204106.81038: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204106.81058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204106.81076: variable 'omit' from source: magic vars 12755 1727204106.81547: variable 'ansible_distribution_major_version' from source: facts 12755 1727204106.81597: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204106.81873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204106.84547: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204106.84694: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204106.84701: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204106.84756: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204106.84795: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204106.84900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204106.84946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204106.84992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204106.85081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204106.85084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204106.85149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204106.85189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204106.85230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204106.85296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204106.85407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204106.85533: variable '__network_required_facts' from source: role 
'' defaults 12755 1727204106.85548: variable 'ansible_facts' from source: unknown 12755 1727204106.88079: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12755 1727204106.88414: when evaluation is False, skipping this task 12755 1727204106.88420: _execute() done 12755 1727204106.88423: dumping result to json 12755 1727204106.88426: done dumping result, returning 12755 1727204106.88429: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-72e9-1a19-0000000006c5] 12755 1727204106.88432: sending task result for task 12b410aa-8751-72e9-1a19-0000000006c5 12755 1727204106.88516: done sending task result for task 12b410aa-8751-72e9-1a19-0000000006c5 12755 1727204106.88524: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204106.88577: no more pending results, returning what we have 12755 1727204106.88581: results queue empty 12755 1727204106.88583: checking for any_errors_fatal 12755 1727204106.88585: done checking for any_errors_fatal 12755 1727204106.88586: checking for max_fail_percentage 12755 1727204106.88588: done checking for max_fail_percentage 12755 1727204106.88592: checking to see if all hosts have failed and the running result is not ok 12755 1727204106.88593: done checking to see if all hosts have failed 12755 1727204106.88594: getting the remaining hosts for this loop 12755 1727204106.88595: done getting the remaining hosts for this loop 12755 1727204106.88602: getting the next task for host managed-node1 12755 1727204106.88614: done getting next task for host managed-node1 12755 1727204106.88618: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204106.88623: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204106.88643: getting variables 12755 1727204106.88645: in VariableManager get_vars() 12755 1727204106.88712: Calling all_inventory to load vars for managed-node1 12755 1727204106.88715: Calling groups_inventory to load vars for managed-node1 12755 1727204106.88718: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204106.88731: Calling all_plugins_play to load vars for managed-node1 12755 1727204106.88734: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204106.88737: Calling groups_plugins_play to load vars for managed-node1 12755 1727204106.94836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204107.01443: done with get_vars() 12755 1727204107.01499: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:07 -0400 (0:00:00.224) 0:00:32.254 ***** 12755 1727204107.01855: entering _queue_task() for managed-node1/stat 12755 1727204107.02663: worker is 1 (out of 1 
available) 12755 1727204107.02679: exiting _queue_task() for managed-node1/stat 12755 1727204107.02804: done queuing things up, now waiting for results queue to drain 12755 1727204107.02806: waiting for pending results... 12755 1727204107.03247: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204107.03635: in run() - task 12b410aa-8751-72e9-1a19-0000000006c7 12755 1727204107.03640: variable 'ansible_search_path' from source: unknown 12755 1727204107.03642: variable 'ansible_search_path' from source: unknown 12755 1727204107.03783: calling self._execute() 12755 1727204107.03984: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204107.03993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204107.04010: variable 'omit' from source: magic vars 12755 1727204107.04855: variable 'ansible_distribution_major_version' from source: facts 12755 1727204107.04882: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204107.05479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204107.06195: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204107.06199: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204107.06202: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204107.06323: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204107.06420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204107.06449: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204107.06481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204107.06721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204107.06823: variable '__network_is_ostree' from source: set_fact 12755 1727204107.06832: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204107.06835: when evaluation is False, skipping this task 12755 1727204107.06838: _execute() done 12755 1727204107.06844: dumping result to json 12755 1727204107.06847: done dumping result, returning 12755 1727204107.06865: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-72e9-1a19-0000000006c7] 12755 1727204107.06868: sending task result for task 12b410aa-8751-72e9-1a19-0000000006c7 12755 1727204107.07178: done sending task result for task 12b410aa-8751-72e9-1a19-0000000006c7 12755 1727204107.07182: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204107.07355: no more pending results, returning what we have 12755 1727204107.07358: results queue empty 12755 1727204107.07360: checking for any_errors_fatal 12755 1727204107.07368: done checking for any_errors_fatal 12755 1727204107.07369: checking for max_fail_percentage 12755 1727204107.07371: done checking for max_fail_percentage 12755 1727204107.07372: checking to see if all hosts have failed and the running result is not ok 12755 
1727204107.07374: done checking to see if all hosts have failed 12755 1727204107.07375: getting the remaining hosts for this loop 12755 1727204107.07376: done getting the remaining hosts for this loop 12755 1727204107.07381: getting the next task for host managed-node1 12755 1727204107.07391: done getting next task for host managed-node1 12755 1727204107.07396: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204107.07417: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204107.07441: getting variables 12755 1727204107.07443: in VariableManager get_vars() 12755 1727204107.07626: Calling all_inventory to load vars for managed-node1 12755 1727204107.07630: Calling groups_inventory to load vars for managed-node1 12755 1727204107.07633: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204107.07644: Calling all_plugins_play to load vars for managed-node1 12755 1727204107.07647: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204107.07651: Calling groups_plugins_play to load vars for managed-node1 12755 1727204107.13070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204107.20269: done with get_vars() 12755 1727204107.20366: done getting variables 12755 1727204107.20564: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:07 -0400 (0:00:00.187) 0:00:32.441 ***** 12755 1727204107.20610: entering _queue_task() for managed-node1/set_fact 12755 1727204107.21434: worker is 1 (out of 1 available) 12755 1727204107.21449: exiting _queue_task() for managed-node1/set_fact 12755 1727204107.21464: done queuing things up, now waiting for results queue to drain 12755 1727204107.21465: waiting for pending results... 
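The two tasks traced above ("Check if system is ostree" and "Set flag to indicate system is ostree", both skipped with `false_condition: "not __network_is_ostree is defined"`) follow a common run-once guard pattern: probe the system a single time, cache the answer in a fact, and let the `when:` guard skip both tasks on every later include. A minimal sketch of that pattern — only the task names and the `when:` expression come from the log; the `stat`/`set_fact` bodies and the `/run/ostree-booted` path are illustrative assumptions:

```yaml
# Hypothetical reconstruction of the guard pattern seen in the trace.
# The probe runs only while __network_is_ostree is still undefined;
# once the fact is set, both tasks skip (as in the log output above).
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # assumed probe; not shown in the log
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

This is why the log shows `Evaluated conditional (not __network_is_ostree is defined): False` — an earlier include already set the fact, so the guard short-circuits and the controller emits the `skipping: [managed-node1]` result without contacting the host.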
12755 1727204107.21913: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204107.22387: in run() - task 12b410aa-8751-72e9-1a19-0000000006c8 12755 1727204107.22393: variable 'ansible_search_path' from source: unknown 12755 1727204107.22396: variable 'ansible_search_path' from source: unknown 12755 1727204107.22399: calling self._execute() 12755 1727204107.22750: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204107.22755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204107.22758: variable 'omit' from source: magic vars 12755 1727204107.23727: variable 'ansible_distribution_major_version' from source: facts 12755 1727204107.23731: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204107.23950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204107.24661: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204107.24720: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204107.24759: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204107.24999: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204107.25155: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204107.25159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204107.25165: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204107.25403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204107.25507: variable '__network_is_ostree' from source: set_fact 12755 1727204107.25515: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204107.25522: when evaluation is False, skipping this task 12755 1727204107.25524: _execute() done 12755 1727204107.25527: dumping result to json 12755 1727204107.25530: done dumping result, returning 12755 1727204107.25541: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-72e9-1a19-0000000006c8] 12755 1727204107.25547: sending task result for task 12b410aa-8751-72e9-1a19-0000000006c8 12755 1727204107.25657: done sending task result for task 12b410aa-8751-72e9-1a19-0000000006c8 12755 1727204107.25661: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204107.25740: no more pending results, returning what we have 12755 1727204107.25745: results queue empty 12755 1727204107.25747: checking for any_errors_fatal 12755 1727204107.25755: done checking for any_errors_fatal 12755 1727204107.25757: checking for max_fail_percentage 12755 1727204107.25759: done checking for max_fail_percentage 12755 1727204107.25760: checking to see if all hosts have failed and the running result is not ok 12755 1727204107.25762: done checking to see if all hosts have failed 12755 1727204107.25763: getting the remaining hosts for this loop 12755 1727204107.25765: done getting the remaining hosts for this loop 
12755 1727204107.25770: getting the next task for host managed-node1 12755 1727204107.25782: done getting next task for host managed-node1 12755 1727204107.25786: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204107.25793: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204107.25815: getting variables 12755 1727204107.25819: in VariableManager get_vars() 12755 1727204107.25883: Calling all_inventory to load vars for managed-node1 12755 1727204107.25886: Calling groups_inventory to load vars for managed-node1 12755 1727204107.26006: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204107.26022: Calling all_plugins_play to load vars for managed-node1 12755 1727204107.26026: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204107.26030: Calling groups_plugins_play to load vars for managed-node1 12755 1727204107.30138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204107.33663: done with get_vars() 12755 1727204107.33709: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:07 -0400 (0:00:00.132) 0:00:32.574 ***** 12755 1727204107.33912: entering _queue_task() for managed-node1/service_facts 12755 1727204107.34451: worker is 1 (out of 1 available) 12755 1727204107.34465: exiting _queue_task() for managed-node1/service_facts 12755 1727204107.34478: done queuing things up, now waiting for results queue to drain 12755 1727204107.34480: waiting for pending results... 
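The "Check which services are running" task queued here is the `service_facts` module, which (unlike the skipped tasks above) does reach the host: the following records show the controller opening the SSH connection, creating a remote temp directory, and shipping `AnsiballZ_service_facts.py`. A hedged sketch of how such a task and its result are typically consumed — the consumer task is illustrative, not part of this role:

```yaml
# The traced task, as it would appear in a tasks file:
- name: Check which services are running
  service_facts:

# Hypothetical consumer: service_facts populates ansible_facts.services,
# keyed by unit name, as seen in the JSON stdout later in this log.
- name: Report NetworkManager state (illustrative only)
  debug:
    msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
```

The module's stdout later in the trace (`{"ansible_facts": {"services": {...}}}`) is exactly this `services` dictionary, with `state`, `status`, and `source` per unit.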
12755 1727204107.34925: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204107.34941: in run() - task 12b410aa-8751-72e9-1a19-0000000006ca 12755 1727204107.34964: variable 'ansible_search_path' from source: unknown 12755 1727204107.34974: variable 'ansible_search_path' from source: unknown 12755 1727204107.35032: calling self._execute() 12755 1727204107.35153: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204107.35168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204107.35186: variable 'omit' from source: magic vars 12755 1727204107.35660: variable 'ansible_distribution_major_version' from source: facts 12755 1727204107.35687: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204107.35703: variable 'omit' from source: magic vars 12755 1727204107.35822: variable 'omit' from source: magic vars 12755 1727204107.35873: variable 'omit' from source: magic vars 12755 1727204107.36000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204107.36004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204107.36016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204107.36045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204107.36065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204107.36114: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204107.36128: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204107.36137: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12755 1727204107.36274: Set connection var ansible_connection to ssh 12755 1727204107.36290: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204107.36300: Set connection var ansible_shell_type to sh 12755 1727204107.36330: Set connection var ansible_timeout to 10 12755 1727204107.36439: Set connection var ansible_shell_executable to /bin/sh 12755 1727204107.36443: Set connection var ansible_pipelining to False 12755 1727204107.36445: variable 'ansible_shell_executable' from source: unknown 12755 1727204107.36448: variable 'ansible_connection' from source: unknown 12755 1727204107.36456: variable 'ansible_module_compression' from source: unknown 12755 1727204107.36458: variable 'ansible_shell_type' from source: unknown 12755 1727204107.36460: variable 'ansible_shell_executable' from source: unknown 12755 1727204107.36462: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204107.36464: variable 'ansible_pipelining' from source: unknown 12755 1727204107.36492: variable 'ansible_timeout' from source: unknown 12755 1727204107.36503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204107.37196: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204107.37201: variable 'omit' from source: magic vars 12755 1727204107.37204: starting attempt loop 12755 1727204107.37206: running the handler 12755 1727204107.37211: _low_level_execute_command(): starting 12755 1727204107.37213: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204107.38495: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12755 1727204107.38500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204107.38503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204107.38506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204107.38658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204107.38780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204107.38865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204107.40732: stdout chunk (state=3): >>>/root <<< 12755 1727204107.40860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204107.41044: stderr chunk (state=3): >>><<< 12755 1727204107.41048: stdout chunk (state=3): >>><<< 12755 1727204107.41052: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204107.41069: _low_level_execute_command(): starting 12755 1727204107.41398: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762 `" && echo ansible-tmp-1727204107.410539-14884-255131341281762="` echo /root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762 `" ) && sleep 0' 12755 1727204107.42597: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204107.42628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 
1727204107.42639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204107.42835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204107.42938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204107.45005: stdout chunk (state=3): >>>ansible-tmp-1727204107.410539-14884-255131341281762=/root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762 <<< 12755 1727204107.45219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204107.45312: stderr chunk (state=3): >>><<< 12755 1727204107.45316: stdout chunk (state=3): >>><<< 12755 1727204107.45341: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204107.410539-14884-255131341281762=/root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204107.45406: variable 'ansible_module_compression' from source: unknown 12755 1727204107.45700: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12755 1727204107.45703: variable 'ansible_facts' from source: unknown 12755 1727204107.45798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/AnsiballZ_service_facts.py 12755 1727204107.46207: Sending initial data 12755 1727204107.46254: Sent initial data (161 bytes) 12755 1727204107.47898: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204107.47904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204107.48021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204107.48037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204107.48127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204107.50003: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204107.50047: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204107.50171: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmppa5k3pf3 /root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/AnsiballZ_service_facts.py <<< 12755 1727204107.50176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/AnsiballZ_service_facts.py" <<< 12755 1727204107.50231: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmppa5k3pf3" to remote "/root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/AnsiballZ_service_facts.py" <<< 12755 1727204107.52608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204107.52612: stderr chunk (state=3): >>><<< 12755 1727204107.52615: stdout chunk (state=3): >>><<< 12755 1727204107.52617: done transferring module to remote 12755 1727204107.52619: _low_level_execute_command(): starting 12755 1727204107.52621: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/ /root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/AnsiballZ_service_facts.py && sleep 0' 12755 1727204107.53464: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204107.53478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204107.53508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204107.53597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204107.53607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204107.53631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204107.53644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204107.53724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204107.55958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204107.56248: stderr chunk (state=3): >>><<< 12755 1727204107.56251: stdout chunk (state=3): >>><<< 12755 1727204107.56254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204107.56257: _low_level_execute_command(): starting 12755 1727204107.56259: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/AnsiballZ_service_facts.py && sleep 0' 12755 1727204107.57029: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204107.57045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204107.57060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204107.57178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204107.57196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204107.57216: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204107.57509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204109.72676: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": 
"dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": 
"ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 12755 1727204109.72702: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name"<<< 12755 1727204109.72721: stdout chunk (state=3): >>>: "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", 
"state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zr<<< 12755 1727204109.72829: stdout chunk (state=3): >>>am0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", 
"source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": <<< 12755 1727204109.72853: stdout chunk (state=3): >>>"inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"},<<< 12755 1727204109.72874: stdout chunk (state=3): >>> "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": 
"sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12755 1727204109.74798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204109.74802: stdout chunk (state=3): >>><<< 12755 1727204109.74805: stderr chunk (state=3): >>><<< 12755 1727204109.74813: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": 
{"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", 
"status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": 
"mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": 
"plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
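The `service_facts` payload that ends above is a flat JSON object keyed by unit name, each value carrying `name`, `state`, `status`, and `source`. As an illustration only (this snippet is not part of the log), a minimal Python sketch of filtering such a structure for running units:

```python
# Minimal sketch (illustrative, not from this run): filter an Ansible
# service_facts-style dict for units whose state is "running".
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
}

# Collect the names of running units, sorted for stable output.
running = sorted(name for name, info in services.items()
                 if info.get("state") == "running")
print(running)  # ['sshd.service']
```

In a playbook the same dict is exposed as `ansible_facts.services`, so the equivalent filter can be done with a Jinja2 `selectattr` expression.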
12755 1727204109.75716: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204109.75731: _low_level_execute_command(): starting 12755 1727204109.75736: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204107.410539-14884-255131341281762/ > /dev/null 2>&1 && sleep 0' 12755 1727204109.76175: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204109.76194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204109.76198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204109.76201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204109.76204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204109.76227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 
1727204109.76230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204109.76287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204109.76292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204109.76347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204109.78696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204109.78701: stdout chunk (state=3): >>><<< 12755 1727204109.78703: stderr chunk (state=3): >>><<< 12755 1727204109.78706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
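The `auto-mux: Trying existing master` and `mux_client_request_session` lines in the stderr above come from OpenSSH connection multiplexing, which Ansible uses to reuse one master SSH connection per host instead of re-handshaking for every task. A hedged sketch of the kind of client configuration that produces this behavior (the path and timeout here are illustrative assumptions, not values taken from this run):

```
# Illustrative ssh_config fragment (assumed, not from this log):
# reuse a single master connection for repeated sessions to one host.
Host 10.31.11.210
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h:%p
    ControlPersist 60s
```

With a persisted master in place, each module invocation shows up in the log as a cheap `mux_client_request_session` rather than a full key exchange.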
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204109.78709: handler run complete 12755 1727204109.78841: variable 'ansible_facts' from source: unknown 12755 1727204109.79068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204109.79883: variable 'ansible_facts' from source: unknown 12755 1727204109.80082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204109.80441: attempt loop complete, returning result 12755 1727204109.80448: _execute() done 12755 1727204109.80451: dumping result to json 12755 1727204109.80536: done dumping result, returning 12755 1727204109.80547: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-72e9-1a19-0000000006ca] 12755 1727204109.80553: sending task result for task 12b410aa-8751-72e9-1a19-0000000006ca 12755 1727204109.82463: done sending task result for task 12b410aa-8751-72e9-1a19-0000000006ca ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204109.82593: no more pending results, returning what we have 12755 1727204109.82597: results queue empty 12755 1727204109.82598: checking for any_errors_fatal 12755 1727204109.82603: done checking for any_errors_fatal 12755 1727204109.82604: checking for max_fail_percentage 12755 1727204109.82606: done checking for max_fail_percentage 12755 1727204109.82607: checking to see if all hosts have failed and the running result is not ok 12755 1727204109.82608: done checking to see if all hosts have failed 12755 1727204109.82609: getting the remaining hosts for this loop 12755 1727204109.82610: done getting the remaining hosts for this loop 12755 1727204109.82615: getting the next task for host managed-node1 12755 
1727204109.82625: done getting next task for host managed-node1 12755 1727204109.82629: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12755 1727204109.82634: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204109.82795: getting variables 12755 1727204109.82797: in VariableManager get_vars() 12755 1727204109.82851: Calling all_inventory to load vars for managed-node1 12755 1727204109.82854: Calling groups_inventory to load vars for managed-node1 12755 1727204109.82857: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204109.82871: WORKER PROCESS EXITING 12755 1727204109.82883: Calling all_plugins_play to load vars for managed-node1 12755 1727204109.82887: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204109.82894: Calling groups_plugins_play to load vars for managed-node1 12755 1727204109.85294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204109.89643: done with get_vars() 12755 1727204109.89707: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:09 -0400 (0:00:02.559) 0:00:35.134 ***** 12755 1727204109.89876: entering _queue_task() for managed-node1/package_facts 12755 1727204109.90305: worker is 1 (out of 1 available) 12755 1727204109.90323: exiting _queue_task() for managed-node1/package_facts 12755 1727204109.90340: done queuing things up, now waiting for results queue to drain 12755 1727204109.90342: waiting for pending results... 
12755 1727204109.90606: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12755 1727204109.90777: in run() - task 12b410aa-8751-72e9-1a19-0000000006cb 12755 1727204109.90795: variable 'ansible_search_path' from source: unknown 12755 1727204109.90802: variable 'ansible_search_path' from source: unknown 12755 1727204109.90838: calling self._execute() 12755 1727204109.90947: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204109.90955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204109.90968: variable 'omit' from source: magic vars 12755 1727204109.91395: variable 'ansible_distribution_major_version' from source: facts 12755 1727204109.91445: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204109.91451: variable 'omit' from source: magic vars 12755 1727204109.91513: variable 'omit' from source: magic vars 12755 1727204109.91557: variable 'omit' from source: magic vars 12755 1727204109.91603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204109.91663: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204109.91667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204109.91696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204109.91753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204109.91793: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204109.91797: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204109.91800: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12755 1727204109.92060: Set connection var ansible_connection to ssh 12755 1727204109.92068: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204109.92071: Set connection var ansible_shell_type to sh 12755 1727204109.92087: Set connection var ansible_timeout to 10 12755 1727204109.92206: Set connection var ansible_shell_executable to /bin/sh 12755 1727204109.92210: Set connection var ansible_pipelining to False 12755 1727204109.92238: variable 'ansible_shell_executable' from source: unknown 12755 1727204109.92241: variable 'ansible_connection' from source: unknown 12755 1727204109.92245: variable 'ansible_module_compression' from source: unknown 12755 1727204109.92316: variable 'ansible_shell_type' from source: unknown 12755 1727204109.92324: variable 'ansible_shell_executable' from source: unknown 12755 1727204109.92327: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204109.92330: variable 'ansible_pipelining' from source: unknown 12755 1727204109.92332: variable 'ansible_timeout' from source: unknown 12755 1727204109.92342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204109.92708: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204109.92723: variable 'omit' from source: magic vars 12755 1727204109.92727: starting attempt loop 12755 1727204109.92730: running the handler 12755 1727204109.92777: _low_level_execute_command(): starting 12755 1727204109.92784: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204109.93904: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204109.93976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204109.94038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204109.94097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204109.96006: stdout chunk (state=3): >>>/root <<< 12755 1727204109.96067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204109.96548: stderr chunk (state=3): >>><<< 12755 1727204109.96553: stdout chunk (state=3): >>><<< 12755 1727204109.96557: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204109.96559: _low_level_execute_command(): starting 12755 1727204109.96563: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835 `" && echo ansible-tmp-1727204109.9642498-15028-255719652969835="` echo /root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835 `" ) && sleep 0' 12755 1727204109.97465: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204109.97469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204109.97485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204109.97491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204109.97570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204109.97626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204109.97736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204109.99931: stdout chunk (state=3): >>>ansible-tmp-1727204109.9642498-15028-255719652969835=/root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835 <<< 12755 1727204110.00295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204110.00299: stdout chunk (state=3): >>><<< 12755 1727204110.00302: stderr chunk (state=3): >>><<< 12755 1727204110.00305: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204109.9642498-15028-255719652969835=/root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204110.00307: variable 'ansible_module_compression' from source: unknown 12755 1727204110.00432: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12755 1727204110.00795: variable 'ansible_facts' from source: unknown 12755 1727204110.01059: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/AnsiballZ_package_facts.py 12755 1727204110.01523: Sending initial data 12755 1727204110.01554: Sent initial data (162 bytes) 12755 1727204110.02908: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204110.02946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204110.02956: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK <<< 12755 1727204110.02972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204110.03070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204110.04952: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204110.05395: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204110.05400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpuygjj9pn /root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/AnsiballZ_package_facts.py <<< 12755 1727204110.05403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/AnsiballZ_package_facts.py" <<< 12755 1727204110.05406: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpuygjj9pn" to remote "/root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/AnsiballZ_package_facts.py" <<< 12755 1727204110.10283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204110.10385: stdout chunk (state=3): >>><<< 12755 1727204110.10407: stderr chunk (state=3): >>><<< 12755 1727204110.10439: done transferring module to remote 12755 1727204110.10463: _low_level_execute_command(): starting 12755 1727204110.10474: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/ /root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/AnsiballZ_package_facts.py && sleep 0' 12755 1727204110.11735: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204110.11930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204110.12123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204110.14198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204110.14302: stderr chunk (state=3): >>><<< 12755 1727204110.14312: stdout chunk (state=3): >>><<< 12755 1727204110.14341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204110.14359: _low_level_execute_command(): starting 12755 1727204110.14371: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/AnsiballZ_package_facts.py && sleep 0' 12755 1727204110.15058: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204110.15073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204110.15088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204110.15112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204110.15152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204110.15169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204110.15265: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204110.15292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204110.15311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204110.15338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12755 1727204110.15437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204110.81751: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": 
"rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 12755 1727204110.81873: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": 
"alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 12755 1727204110.81886: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", 
"version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 12755 1727204110.81912: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", 
"version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": 
"2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 12755 1727204110.81937: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": 
"psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": 
"121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 12755 1727204110.81953: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 12755 1727204110.81963: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 12755 1727204110.82020: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": 
"1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": n<<< 12755 1727204110.82049: stdout chunk (state=3): >>>ull, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": 
"3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5<<< 12755 1727204110.82068: stdout chunk (state=3): >>>", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": 
"perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": 
"noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 12755 1727204110.82080: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 12755 1727204110.82107: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 12755 1727204110.82125: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", 
"version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 12755 1727204110.82142: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", 
"version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 12755 1727204110.82153: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": 
"wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12755 1727204110.84235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204110.84298: stderr chunk (state=3): >>><<< 12755 1727204110.84303: stdout chunk (state=3): >>><<< 12755 1727204110.84353: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": 
"fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", 
"version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": 
"xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": 
"53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", 
"version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": 
"5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": 
"btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", 
"version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", 
"version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": 
"nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", 
"version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": 
"parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": 
[{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": 
"2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": 
"rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", 
"release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": 
"5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": 
"gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": 
{"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204110.86624: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204110.86646: _low_level_execute_command(): starting 12755 1727204110.86649: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204109.9642498-15028-255719652969835/ > /dev/null 2>&1 && sleep 0' 12755 1727204110.87156: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204110.87159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204110.87162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204110.87164: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204110.87166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204110.87169: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204110.87214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204110.87231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204110.87284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204110.89377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204110.89432: stderr chunk (state=3): >>><<< 12755 1727204110.89437: stdout chunk (state=3): >>><<< 12755 1727204110.89452: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 12755 1727204110.89458: handler run complete 12755 1727204110.90281: variable 'ansible_facts' from source: unknown 12755 1727204110.90992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204110.93091: variable 'ansible_facts' from source: unknown 12755 1727204110.93815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204110.95297: attempt loop complete, returning result 12755 1727204110.95301: _execute() done 12755 1727204110.95303: dumping result to json 12755 1727204110.95598: done dumping result, returning 12755 1727204110.95614: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-72e9-1a19-0000000006cb] 12755 1727204110.95617: sending task result for task 12b410aa-8751-72e9-1a19-0000000006cb ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204110.99467: done sending task result for task 12b410aa-8751-72e9-1a19-0000000006cb 12755 1727204110.99472: WORKER PROCESS EXITING 12755 1727204110.99484: no more pending results, returning what we have 12755 1727204110.99486: results queue empty 12755 1727204110.99487: checking for any_errors_fatal 12755 1727204110.99493: done checking for any_errors_fatal 12755 1727204110.99494: checking for max_fail_percentage 12755 1727204110.99495: done checking for max_fail_percentage 12755 1727204110.99496: checking to see if all hosts have failed and the running result is not ok 12755 1727204110.99496: done checking to see if all hosts have failed 12755 1727204110.99497: getting the remaining hosts for this loop 12755 1727204110.99498: done getting the remaining hosts for this loop 12755 1727204110.99501: getting the next task for host managed-node1 12755 1727204110.99507: done 
getting next task for host managed-node1 12755 1727204110.99510: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204110.99512: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204110.99522: getting variables 12755 1727204110.99523: in VariableManager get_vars() 12755 1727204110.99559: Calling all_inventory to load vars for managed-node1 12755 1727204110.99561: Calling groups_inventory to load vars for managed-node1 12755 1727204110.99563: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204110.99572: Calling all_plugins_play to load vars for managed-node1 12755 1727204110.99576: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204110.99580: Calling groups_plugins_play to load vars for managed-node1 12755 1727204111.00705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204111.03449: done with get_vars() 12755 1727204111.03503: done getting variables 12755 1727204111.03589: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:11 -0400 (0:00:01.137) 0:00:36.272 ***** 12755 1727204111.03651: entering _queue_task() for managed-node1/debug 12755 1727204111.04223: worker is 1 (out of 1 available) 12755 1727204111.04235: exiting _queue_task() for managed-node1/debug 12755 1727204111.04250: done queuing things up, now waiting for results queue to drain 12755 1727204111.04251: waiting for pending results... 12755 1727204111.04496: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204111.04616: in run() - task 12b410aa-8751-72e9-1a19-00000000007c 12755 1727204111.04643: variable 'ansible_search_path' from source: unknown 12755 1727204111.04699: variable 'ansible_search_path' from source: unknown 12755 1727204111.04703: calling self._execute() 12755 1727204111.04835: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.04849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.04865: variable 'omit' from source: magic vars 12755 1727204111.05365: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.05392: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204111.05405: variable 'omit' from source: magic vars 12755 1727204111.05574: variable 'omit' from source: magic vars 12755 1727204111.05638: variable 'network_provider' from source: set_fact 12755 1727204111.05664: variable 'omit' from source: magic vars 12755 1727204111.05731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204111.05776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204111.05815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 
1727204111.05844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204111.05864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204111.05916: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204111.05929: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.05938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.06078: Set connection var ansible_connection to ssh 12755 1727204111.06097: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204111.06106: Set connection var ansible_shell_type to sh 12755 1727204111.06229: Set connection var ansible_timeout to 10 12755 1727204111.06234: Set connection var ansible_shell_executable to /bin/sh 12755 1727204111.06237: Set connection var ansible_pipelining to False 12755 1727204111.06239: variable 'ansible_shell_executable' from source: unknown 12755 1727204111.06241: variable 'ansible_connection' from source: unknown 12755 1727204111.06243: variable 'ansible_module_compression' from source: unknown 12755 1727204111.06245: variable 'ansible_shell_type' from source: unknown 12755 1727204111.06247: variable 'ansible_shell_executable' from source: unknown 12755 1727204111.06249: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.06251: variable 'ansible_pipelining' from source: unknown 12755 1727204111.06253: variable 'ansible_timeout' from source: unknown 12755 1727204111.06255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.06443: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204111.06463: variable 'omit' from source: magic vars 12755 1727204111.06479: starting attempt loop 12755 1727204111.06487: running the handler 12755 1727204111.06553: handler run complete 12755 1727204111.06577: attempt loop complete, returning result 12755 1727204111.06589: _execute() done 12755 1727204111.06664: dumping result to json 12755 1727204111.06668: done dumping result, returning 12755 1727204111.06671: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-72e9-1a19-00000000007c] 12755 1727204111.06673: sending task result for task 12b410aa-8751-72e9-1a19-00000000007c 12755 1727204111.06753: done sending task result for task 12b410aa-8751-72e9-1a19-00000000007c 12755 1727204111.06756: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 12755 1727204111.06845: no more pending results, returning what we have 12755 1727204111.06849: results queue empty 12755 1727204111.06850: checking for any_errors_fatal 12755 1727204111.06864: done checking for any_errors_fatal 12755 1727204111.06865: checking for max_fail_percentage 12755 1727204111.06867: done checking for max_fail_percentage 12755 1727204111.06868: checking to see if all hosts have failed and the running result is not ok 12755 1727204111.06870: done checking to see if all hosts have failed 12755 1727204111.06870: getting the remaining hosts for this loop 12755 1727204111.06986: done getting the remaining hosts for this loop 12755 1727204111.06998: getting the next task for host managed-node1 12755 1727204111.07006: done getting next task for host managed-node1 12755 1727204111.07011: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 12755 1727204111.07016: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204111.07033: getting variables 12755 1727204111.07036: in VariableManager get_vars() 12755 1727204111.07150: Calling all_inventory to load vars for managed-node1 12755 1727204111.07155: Calling groups_inventory to load vars for managed-node1 12755 1727204111.07158: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204111.07171: Calling all_plugins_play to load vars for managed-node1 12755 1727204111.07175: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204111.07179: Calling groups_plugins_play to load vars for managed-node1 12755 1727204111.09856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204111.13079: done with get_vars() 12755 1727204111.13125: done getting variables 12755 1727204111.13196: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.095) 0:00:36.368 ***** 12755 1727204111.13232: entering _queue_task() for managed-node1/fail 12755 1727204111.13594: worker is 1 (out of 1 available) 12755 1727204111.13611: exiting _queue_task() for managed-node1/fail 12755 1727204111.13628: done queuing things up, now waiting for results queue to drain 12755 1727204111.13630: waiting for pending results... 12755 1727204111.14422: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12755 1727204111.14428: in run() - task 12b410aa-8751-72e9-1a19-00000000007d 12755 1727204111.14432: variable 'ansible_search_path' from source: unknown 12755 1727204111.14435: variable 'ansible_search_path' from source: unknown 12755 1727204111.14438: calling self._execute() 12755 1727204111.14442: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.14444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.14447: variable 'omit' from source: magic vars 12755 1727204111.14883: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.14912: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204111.15096: variable 'network_state' from source: role '' defaults 12755 1727204111.15100: Evaluated conditional (network_state != {}): False 12755 1727204111.15107: when evaluation is False, skipping this task 12755 1727204111.15194: _execute() done 12755 1727204111.15199: dumping result to json 12755 1727204111.15201: done dumping result, returning 12755 1727204111.15204: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [12b410aa-8751-72e9-1a19-00000000007d] 12755 1727204111.15208: sending task result for task 12b410aa-8751-72e9-1a19-00000000007d 12755 1727204111.15397: done sending task result for task 12b410aa-8751-72e9-1a19-00000000007d 12755 1727204111.15401: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204111.15460: no more pending results, returning what we have 12755 1727204111.15465: results queue empty 12755 1727204111.15466: checking for any_errors_fatal 12755 1727204111.15478: done checking for any_errors_fatal 12755 1727204111.15479: checking for max_fail_percentage 12755 1727204111.15481: done checking for max_fail_percentage 12755 1727204111.15482: checking to see if all hosts have failed and the running result is not ok 12755 1727204111.15483: done checking to see if all hosts have failed 12755 1727204111.15484: getting the remaining hosts for this loop 12755 1727204111.15486: done getting the remaining hosts for this loop 12755 1727204111.15494: getting the next task for host managed-node1 12755 1727204111.15503: done getting next task for host managed-node1 12755 1727204111.15507: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12755 1727204111.15512: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12755 1727204111.15536: getting variables 12755 1727204111.15538: in VariableManager get_vars() 12755 1727204111.15706: Calling all_inventory to load vars for managed-node1 12755 1727204111.15710: Calling groups_inventory to load vars for managed-node1 12755 1727204111.15713: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204111.15726: Calling all_plugins_play to load vars for managed-node1 12755 1727204111.15731: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204111.15735: Calling groups_plugins_play to load vars for managed-node1 12755 1727204111.19400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204111.23708: done with get_vars() 12755 1727204111.23765: done getting variables 12755 1727204111.23832: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.106) 0:00:36.474 ***** 12755 1727204111.23873: entering _queue_task() for managed-node1/fail 12755 1727204111.24254: worker is 1 (out of 1 available) 12755 1727204111.24268: exiting _queue_task() for managed-node1/fail 12755 1727204111.24284: done queuing things up, now waiting for results queue to drain 12755 1727204111.24286: waiting for pending results... 
12755 1727204111.24697: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12755 1727204111.24774: in run() - task 12b410aa-8751-72e9-1a19-00000000007e 12755 1727204111.24801: variable 'ansible_search_path' from source: unknown 12755 1727204111.24816: variable 'ansible_search_path' from source: unknown 12755 1727204111.24895: calling self._execute() 12755 1727204111.24980: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.24995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.25013: variable 'omit' from source: magic vars 12755 1727204111.25463: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.25696: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204111.25699: variable 'network_state' from source: role '' defaults 12755 1727204111.25702: Evaluated conditional (network_state != {}): False 12755 1727204111.25705: when evaluation is False, skipping this task 12755 1727204111.25708: _execute() done 12755 1727204111.25710: dumping result to json 12755 1727204111.25713: done dumping result, returning 12755 1727204111.25715: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-72e9-1a19-00000000007e] 12755 1727204111.25718: sending task result for task 12b410aa-8751-72e9-1a19-00000000007e 12755 1727204111.25823: done sending task result for task 12b410aa-8751-72e9-1a19-00000000007e skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204111.25882: no more pending results, returning what we have 12755 1727204111.25887: results queue empty 12755 
1727204111.25888: checking for any_errors_fatal 12755 1727204111.25903: done checking for any_errors_fatal 12755 1727204111.25904: checking for max_fail_percentage 12755 1727204111.25906: done checking for max_fail_percentage 12755 1727204111.25907: checking to see if all hosts have failed and the running result is not ok 12755 1727204111.25909: done checking to see if all hosts have failed 12755 1727204111.25910: getting the remaining hosts for this loop 12755 1727204111.25912: done getting the remaining hosts for this loop 12755 1727204111.25918: getting the next task for host managed-node1 12755 1727204111.25927: done getting next task for host managed-node1 12755 1727204111.25931: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12755 1727204111.25936: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204111.25962: getting variables 12755 1727204111.25965: in VariableManager get_vars() 12755 1727204111.26035: Calling all_inventory to load vars for managed-node1 12755 1727204111.26039: Calling groups_inventory to load vars for managed-node1 12755 1727204111.26042: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204111.26058: Calling all_plugins_play to load vars for managed-node1 12755 1727204111.26062: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204111.26066: Calling groups_plugins_play to load vars for managed-node1 12755 1727204111.26906: WORKER PROCESS EXITING 12755 1727204111.28504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204111.31415: done with get_vars() 12755 1727204111.31466: done getting variables 12755 1727204111.31542: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.077) 0:00:36.551 ***** 12755 1727204111.31587: entering _queue_task() for managed-node1/fail 12755 1727204111.31970: worker is 1 (out of 1 available) 12755 1727204111.31987: exiting _queue_task() for managed-node1/fail 12755 1727204111.32101: done queuing things up, now waiting for results queue to drain 12755 1727204111.32103: waiting for pending results... 
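The skip recorded above comes from a guarded `fail` action: the log shows `Evaluated conditional (ansible_distribution_major_version != '6'): True` followed by `Evaluated conditional (network_state != {}): False`, so the abort never fires. A minimal sketch of such a guarded abort task, assuming a hypothetical failure message (the role's actual source at tasks/main.yml:18 may differ):

```yaml
# Hypothetical sketch mirroring the conditionals evaluated in the log above;
# not the role's verbatim source. List-form `when` items are ANDed together.
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  fail:
    msg: Applying network_state is not supported on this host  # assumed wording
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}   # evaluated False in this run, so the task is skipped
```

Because `network_state` resolved to the role's empty default, the task exits from `_execute()` with `skip_reason: "Conditional result was False"` rather than dispatching the `fail` action module.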
12755 1727204111.32331: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12755 1727204111.32506: in run() - task 12b410aa-8751-72e9-1a19-00000000007f 12755 1727204111.32529: variable 'ansible_search_path' from source: unknown 12755 1727204111.32538: variable 'ansible_search_path' from source: unknown 12755 1727204111.32586: calling self._execute() 12755 1727204111.32708: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.32723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.32741: variable 'omit' from source: magic vars 12755 1727204111.33198: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.33222: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204111.33458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204111.36527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204111.36616: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204111.36665: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204111.36716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204111.36754: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204111.36856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.36901: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.36943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.37002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.37035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.37158: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.37181: Evaluated conditional (ansible_distribution_major_version | int > 9): True 12755 1727204111.37347: variable 'ansible_distribution' from source: facts 12755 1727204111.37358: variable '__network_rh_distros' from source: role '' defaults 12755 1727204111.37372: Evaluated conditional (ansible_distribution in __network_rh_distros): False 12755 1727204111.37380: when evaluation is False, skipping this task 12755 1727204111.37388: _execute() done 12755 1727204111.37398: dumping result to json 12755 1727204111.37494: done dumping result, returning 12755 1727204111.37498: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-72e9-1a19-00000000007f] 12755 1727204111.37501: sending task result for task 12b410aa-8751-72e9-1a19-00000000007f 12755 1727204111.37578: done sending task result for task 12b410aa-8751-72e9-1a19-00000000007f 12755 1727204111.37581: WORKER PROCESS EXITING 
skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 12755 1727204111.37639: no more pending results, returning what we have 12755 1727204111.37643: results queue empty 12755 1727204111.37645: checking for any_errors_fatal 12755 1727204111.37652: done checking for any_errors_fatal 12755 1727204111.37653: checking for max_fail_percentage 12755 1727204111.37655: done checking for max_fail_percentage 12755 1727204111.37656: checking to see if all hosts have failed and the running result is not ok 12755 1727204111.37657: done checking to see if all hosts have failed 12755 1727204111.37658: getting the remaining hosts for this loop 12755 1727204111.37661: done getting the remaining hosts for this loop 12755 1727204111.37666: getting the next task for host managed-node1 12755 1727204111.37674: done getting next task for host managed-node1 12755 1727204111.37680: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12755 1727204111.37684: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204111.37708: getting variables 12755 1727204111.37710: in VariableManager get_vars() 12755 1727204111.37777: Calling all_inventory to load vars for managed-node1 12755 1727204111.37781: Calling groups_inventory to load vars for managed-node1 12755 1727204111.37784: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204111.38001: Calling all_plugins_play to load vars for managed-node1 12755 1727204111.38006: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204111.38010: Calling groups_plugins_play to load vars for managed-node1 12755 1727204111.40402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204111.43331: done with get_vars() 12755 1727204111.43370: done getting variables 12755 1727204111.43450: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.119) 0:00:36.670 ***** 12755 1727204111.43493: entering _queue_task() for managed-node1/dnf 12755 1727204111.43855: worker is 1 (out of 1 available) 12755 1727204111.43870: exiting _queue_task() for managed-node1/dnf 12755 1727204111.43885: done queuing things up, now waiting for results queue to drain 12755 1727204111.43887: waiting for pending results... 
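The teaming abort above illustrates how a list-form `when` short-circuits: `ansible_distribution_major_version | int > 9` evaluated True, but `ansible_distribution in __network_rh_distros` evaluated False, so the task is skipped and only the failing condition is reported as `false_condition`. A sketch of that gate, with an assumed message (not the role's verbatim source):

```yaml
# Hypothetical sketch of the EL10+ teaming abort; the second condition was the
# one that evaluated False in this run, producing the skip shown in the log.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on EL10 or later  # assumed wording
  when:
    - ansible_distribution_major_version | int > 9      # True in this run
    - ansible_distribution in __network_rh_distros      # False -> task skipped
```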
12755 1727204111.44311: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12755 1727204111.44375: in run() - task 12b410aa-8751-72e9-1a19-000000000080 12755 1727204111.44400: variable 'ansible_search_path' from source: unknown 12755 1727204111.44409: variable 'ansible_search_path' from source: unknown 12755 1727204111.44457: calling self._execute() 12755 1727204111.44647: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.44650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.44654: variable 'omit' from source: magic vars 12755 1727204111.45054: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.45074: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204111.45348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204111.48096: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204111.48100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204111.48103: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204111.48146: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204111.48183: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204111.48291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.48341: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.48382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.48446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.48470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.48624: variable 'ansible_distribution' from source: facts 12755 1727204111.48637: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.48656: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12755 1727204111.48813: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204111.49009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.49090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.49093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.49140: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.49161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.49219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.49251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.49283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.49341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.49360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.49494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.49498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 
1727204111.49501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.49536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.49559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.49775: variable 'network_connections' from source: task vars 12755 1727204111.49795: variable 'controller_profile' from source: play vars 12755 1727204111.49880: variable 'controller_profile' from source: play vars 12755 1727204111.49973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204111.50210: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204111.50261: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204111.50311: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204111.50352: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204111.50416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204111.50447: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204111.50608: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.50611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204111.50614: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204111.50944: variable 'network_connections' from source: task vars 12755 1727204111.50957: variable 'controller_profile' from source: play vars 12755 1727204111.51033: variable 'controller_profile' from source: play vars 12755 1727204111.51072: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12755 1727204111.51082: when evaluation is False, skipping this task 12755 1727204111.51092: _execute() done 12755 1727204111.51101: dumping result to json 12755 1727204111.51110: done dumping result, returning 12755 1727204111.51124: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000080] 12755 1727204111.51136: sending task result for task 12b410aa-8751-72e9-1a19-000000000080 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12755 1727204111.51324: no more pending results, returning what we have 12755 1727204111.51328: results queue empty 12755 1727204111.51330: checking for any_errors_fatal 12755 1727204111.51338: done checking for any_errors_fatal 12755 1727204111.51340: checking for max_fail_percentage 12755 1727204111.51342: done checking for max_fail_percentage 12755 
1727204111.51342: checking to see if all hosts have failed and the running result is not ok 12755 1727204111.51344: done checking to see if all hosts have failed 12755 1727204111.51345: getting the remaining hosts for this loop 12755 1727204111.51348: done getting the remaining hosts for this loop 12755 1727204111.51353: getting the next task for host managed-node1 12755 1727204111.51363: done getting next task for host managed-node1 12755 1727204111.51368: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12755 1727204111.51372: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204111.51698: getting variables 12755 1727204111.51700: in VariableManager get_vars() 12755 1727204111.51759: Calling all_inventory to load vars for managed-node1 12755 1727204111.51762: Calling groups_inventory to load vars for managed-node1 12755 1727204111.51766: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204111.51777: Calling all_plugins_play to load vars for managed-node1 12755 1727204111.51782: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204111.51786: Calling groups_plugins_play to load vars for managed-node1 12755 1727204111.51801: done sending task result for task 12b410aa-8751-72e9-1a19-000000000080 12755 1727204111.51804: WORKER PROCESS EXITING 12755 1727204111.54147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204111.62553: done with get_vars() 12755 1727204111.62611: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12755 1727204111.62695: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.192) 0:00:36.863 ***** 12755 1727204111.62731: entering _queue_task() for managed-node1/yum 12755 1727204111.63201: worker is 1 (out of 1 available) 12755 1727204111.63215: exiting _queue_task() for managed-node1/yum 12755 1727204111.63227: done queuing things up, now 
waiting for results queue to drain 12755 1727204111.63230: waiting for pending results... 12755 1727204111.63468: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12755 1727204111.63659: in run() - task 12b410aa-8751-72e9-1a19-000000000081 12755 1727204111.63682: variable 'ansible_search_path' from source: unknown 12755 1727204111.63700: variable 'ansible_search_path' from source: unknown 12755 1727204111.63753: calling self._execute() 12755 1727204111.63877: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.63895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.63917: variable 'omit' from source: magic vars 12755 1727204111.64599: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.64620: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204111.64942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204111.67604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204111.67702: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204111.67757: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204111.67808: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204111.67849: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204111.67950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.67998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.68075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.68098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.68122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.68238: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.68262: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12755 1727204111.68271: when evaluation is False, skipping this task 12755 1727204111.68292: _execute() done 12755 1727204111.68295: dumping result to json 12755 1727204111.68397: done dumping result, returning 12755 1727204111.68400: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000081] 12755 1727204111.68403: sending task result for task 12b410aa-8751-72e9-1a19-000000000081 12755 1727204111.68474: done sending task result for task 12b410aa-8751-72e9-1a19-000000000081 12755 1727204111.68476: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12755 1727204111.68568: no more pending results, returning what we have 12755 1727204111.68572: results queue empty 12755 1727204111.68573: checking for any_errors_fatal 12755 1727204111.68582: done checking for any_errors_fatal 12755 1727204111.68584: checking for max_fail_percentage 12755 1727204111.68586: done checking for max_fail_percentage 12755 1727204111.68586: checking to see if all hosts have failed and the running result is not ok 12755 1727204111.68588: done checking to see if all hosts have failed 12755 1727204111.68790: getting the remaining hosts for this loop 12755 1727204111.68794: done getting the remaining hosts for this loop 12755 1727204111.68799: getting the next task for host managed-node1 12755 1727204111.68805: done getting next task for host managed-node1 12755 1727204111.68809: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12755 1727204111.68812: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204111.68832: getting variables 12755 1727204111.68834: in VariableManager get_vars() 12755 1727204111.69095: Calling all_inventory to load vars for managed-node1 12755 1727204111.69100: Calling groups_inventory to load vars for managed-node1 12755 1727204111.69103: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204111.69115: Calling all_plugins_play to load vars for managed-node1 12755 1727204111.69119: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204111.69123: Calling groups_plugins_play to load vars for managed-node1 12755 1727204111.71339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204111.75012: done with get_vars() 12755 1727204111.75056: done getting variables 12755 1727204111.75132: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.124) 0:00:36.987 ***** 12755 1727204111.75175: entering _queue_task() for managed-node1/fail 12755 1727204111.75559: worker is 1 (out of 1 available) 12755 1727204111.75665: exiting _queue_task() for managed-node1/fail 12755 1727204111.75679: done queuing things up, now waiting for results queue to drain 12755 1727204111.75680: waiting for pending results... 
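The back-to-back DNF and YUM checks above are complementary version gates: the DNF task runs when `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`, while the YUM task is reserved for `ansible_distribution_major_version | int < 8` (which evaluated False here; note the log also shows `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` on this host). A sketch of that pairing, where only the `when` expressions are taken verbatim from the evaluated conditionals in the log and the module arguments are assumptions:

```yaml
# Hypothetical sketch of the complementary package-manager gates; the
# `network_packages` variable name and module arguments are assumptions.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ network_packages }}"   # assumed variable name
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:
    name: "{{ network_packages }}"   # assumed variable name
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8   # False on this EL host
    - __network_wireless_connections_defined or __network_team_connections_defined
```

On this run both tasks skip: the DNF task because neither wireless nor team connections are defined, the YUM task because the host's major version is not below 8.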
12755 1727204111.76312: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12755 1727204111.76436: in run() - task 12b410aa-8751-72e9-1a19-000000000082 12755 1727204111.76536: variable 'ansible_search_path' from source: unknown 12755 1727204111.76547: variable 'ansible_search_path' from source: unknown 12755 1727204111.76597: calling self._execute() 12755 1727204111.76902: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.76917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.76933: variable 'omit' from source: magic vars 12755 1727204111.77539: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.77563: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204111.77727: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204111.78010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204111.81213: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204111.81303: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204111.81367: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204111.81418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204111.81455: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204111.81558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12755 1727204111.81607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.81644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.81707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.81797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.81801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.81829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.81867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.81928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.81948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.82011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.82049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.82085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.82146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.82165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.82399: variable 'network_connections' from source: task vars 12755 1727204111.82453: variable 'controller_profile' from source: play vars 12755 1727204111.82511: variable 'controller_profile' from source: play vars 12755 1727204111.82621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204111.82836: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204111.82890: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204111.82996: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 
1727204111.83000: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204111.83042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204111.83074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204111.83116: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.83155: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204111.83222: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204111.83558: variable 'network_connections' from source: task vars 12755 1727204111.83570: variable 'controller_profile' from source: play vars 12755 1727204111.83651: variable 'controller_profile' from source: play vars 12755 1727204111.83685: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12755 1727204111.83758: when evaluation is False, skipping this task 12755 1727204111.83761: _execute() done 12755 1727204111.83764: dumping result to json 12755 1727204111.83767: done dumping result, returning 12755 1727204111.83769: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000082] 12755 1727204111.83772: sending task result for task 12b410aa-8751-72e9-1a19-000000000082 12755 1727204111.84103: 
done sending task result for task 12b410aa-8751-72e9-1a19-000000000082 12755 1727204111.84106: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12755 1727204111.84155: no more pending results, returning what we have 12755 1727204111.84158: results queue empty 12755 1727204111.84160: checking for any_errors_fatal 12755 1727204111.84165: done checking for any_errors_fatal 12755 1727204111.84166: checking for max_fail_percentage 12755 1727204111.84168: done checking for max_fail_percentage 12755 1727204111.84169: checking to see if all hosts have failed and the running result is not ok 12755 1727204111.84170: done checking to see if all hosts have failed 12755 1727204111.84171: getting the remaining hosts for this loop 12755 1727204111.84172: done getting the remaining hosts for this loop 12755 1727204111.84176: getting the next task for host managed-node1 12755 1727204111.84183: done getting next task for host managed-node1 12755 1727204111.84187: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12755 1727204111.84193: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 12755 1727204111.84213: getting variables 12755 1727204111.84215: in VariableManager get_vars() 12755 1727204111.84273: Calling all_inventory to load vars for managed-node1 12755 1727204111.84276: Calling groups_inventory to load vars for managed-node1 12755 1727204111.84279: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204111.84303: Calling all_plugins_play to load vars for managed-node1 12755 1727204111.84311: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204111.84316: Calling groups_plugins_play to load vars for managed-node1 12755 1727204111.87925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204111.91383: done with get_vars() 12755 1727204111.91494: done getting variables 12755 1727204111.91576: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:55:11 -0400 (0:00:00.164) 0:00:37.151 *****

12755 1727204111.91626: entering _queue_task() for managed-node1/package 12755 1727204111.92115: worker is 1 (out of 1 available) 12755 1727204111.92132: exiting _queue_task() for managed-node1/package 12755 1727204111.92146: done queuing things up, now waiting for results queue to drain 12755 1727204111.92147: waiting for pending results...
12755 1727204111.92355: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 12755 1727204111.92542: in run() - task 12b410aa-8751-72e9-1a19-000000000083 12755 1727204111.92573: variable 'ansible_search_path' from source: unknown 12755 1727204111.92585: variable 'ansible_search_path' from source: unknown 12755 1727204111.92636: calling self._execute() 12755 1727204111.92763: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204111.92778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204111.92804: variable 'omit' from source: magic vars 12755 1727204111.93285: variable 'ansible_distribution_major_version' from source: facts 12755 1727204111.93307: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204111.93584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204111.93997: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204111.94002: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204111.94026: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204111.94115: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204111.94261: variable 'network_packages' from source: role '' defaults 12755 1727204111.94408: variable '__network_provider_setup' from source: role '' defaults 12755 1727204111.94433: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204111.94515: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204111.94531: variable '__network_packages_default_nm' from source: role '' defaults 12755 1727204111.94615: variable 
'__network_packages_default_nm' from source: role '' defaults 12755 1727204111.94892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204111.97231: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204111.97322: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204111.97479: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204111.97483: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204111.97486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204111.97565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.97612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.97650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.97708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.97731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 
1727204111.97791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.97835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.97870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.97930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.97954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.98275: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12755 1727204111.98431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.98470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.98508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.98562: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.98794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.98798: variable 'ansible_python' from source: facts 12755 1727204111.98801: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12755 1727204111.98853: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204111.98963: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12755 1727204111.99145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.99181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.99221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.99279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.99305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.99372: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204111.99419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204111.99459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204111.99516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204111.99539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204111.99734: variable 'network_connections' from source: task vars 12755 1727204111.99748: variable 'controller_profile' from source: play vars 12755 1727204111.99872: variable 'controller_profile' from source: play vars 12755 1727204111.99964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204112.00011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204112.00053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.00098: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204112.00162: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204112.00855: variable 'network_connections' from source: task vars 12755 1727204112.00937: variable 'controller_profile' from source: play vars 12755 1727204112.00981: variable 'controller_profile' from source: play vars 12755 1727204112.01028: variable '__network_packages_default_wireless' from source: role '' defaults 12755 1727204112.01131: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204112.01527: variable 'network_connections' from source: task vars 12755 1727204112.01538: variable 'controller_profile' from source: play vars 12755 1727204112.01621: variable 'controller_profile' from source: play vars 12755 1727204112.01651: variable '__network_packages_default_team' from source: role '' defaults 12755 1727204112.01753: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204112.02161: variable 'network_connections' from source: task vars 12755 1727204112.02173: variable 'controller_profile' from source: play vars 12755 1727204112.02256: variable 'controller_profile' from source: play vars 12755 1727204112.02328: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204112.02409: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204112.02499: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204112.02504: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204112.02802: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12755 1727204112.03439: variable 'network_connections' from source: task vars 12755 
1727204112.03451: variable 'controller_profile' from source: play vars 12755 1727204112.03528: variable 'controller_profile' from source: play vars 12755 1727204112.03542: variable 'ansible_distribution' from source: facts 12755 1727204112.03552: variable '__network_rh_distros' from source: role '' defaults 12755 1727204112.03562: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.03583: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12755 1727204112.03804: variable 'ansible_distribution' from source: facts 12755 1727204112.03814: variable '__network_rh_distros' from source: role '' defaults 12755 1727204112.03825: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.03836: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12755 1727204112.04063: variable 'ansible_distribution' from source: facts 12755 1727204112.04294: variable '__network_rh_distros' from source: role '' defaults 12755 1727204112.04297: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.04300: variable 'network_provider' from source: set_fact 12755 1727204112.04302: variable 'ansible_facts' from source: unknown 12755 1727204112.05421: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12755 1727204112.05431: when evaluation is False, skipping this task 12755 1727204112.05439: _execute() done 12755 1727204112.05447: dumping result to json 12755 1727204112.05455: done dumping result, returning 12755 1727204112.05467: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-72e9-1a19-000000000083] 12755 1727204112.05478: sending task result for task 12b410aa-8751-72e9-1a19-000000000083
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
12755 1727204112.05655: no more pending results, returning what we have 12755 1727204112.05659: results queue empty 12755 1727204112.05661: checking for any_errors_fatal 12755 1727204112.05669: done checking for any_errors_fatal 12755 1727204112.05670: checking for max_fail_percentage 12755 1727204112.05672: done checking for max_fail_percentage 12755 1727204112.05672: checking to see if all hosts have failed and the running result is not ok 12755 1727204112.05674: done checking to see if all hosts have failed 12755 1727204112.05674: getting the remaining hosts for this loop 12755 1727204112.05676: done getting the remaining hosts for this loop 12755 1727204112.05681: getting the next task for host managed-node1 12755 1727204112.05690: done getting next task for host managed-node1 12755 1727204112.05694: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12755 1727204112.05697: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 12755 1727204112.05725: getting variables 12755 1727204112.05727: in VariableManager get_vars() 12755 1727204112.05788: Calling all_inventory to load vars for managed-node1 12755 1727204112.05996: Calling groups_inventory to load vars for managed-node1 12755 1727204112.06001: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204112.06008: done sending task result for task 12b410aa-8751-72e9-1a19-000000000083 12755 1727204112.06011: WORKER PROCESS EXITING 12755 1727204112.06022: Calling all_plugins_play to load vars for managed-node1 12755 1727204112.06026: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204112.06031: Calling groups_plugins_play to load vars for managed-node1 12755 1727204112.08121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204112.11126: done with get_vars() 12755 1727204112.11162: done getting variables 12755 1727204112.11231: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.196) 0:00:37.348 *****

12755 1727204112.11269: entering _queue_task() for managed-node1/package 12755 1727204112.11620: worker is 1 (out of 1 available) 12755 1727204112.11635: exiting _queue_task() for managed-node1/package 12755 1727204112.11651: done queuing things up, now waiting for results queue to drain 12755 1727204112.11652: waiting for pending results...
12755 1727204112.12509: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12755 1727204112.12848: in run() - task 12b410aa-8751-72e9-1a19-000000000084 12755 1727204112.12853: variable 'ansible_search_path' from source: unknown 12755 1727204112.12856: variable 'ansible_search_path' from source: unknown 12755 1727204112.12879: calling self._execute() 12755 1727204112.13201: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204112.13216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204112.13234: variable 'omit' from source: magic vars 12755 1727204112.13979: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.14001: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204112.14165: variable 'network_state' from source: role '' defaults 12755 1727204112.14185: Evaluated conditional (network_state != {}): False 12755 1727204112.14196: when evaluation is False, skipping this task 12755 1727204112.14204: _execute() done 12755 1727204112.14213: dumping result to json 12755 1727204112.14222: done dumping result, returning 12755 1727204112.14237: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-72e9-1a19-000000000084] 12755 1727204112.14250: sending task result for task 12b410aa-8751-72e9-1a19-000000000084 12755 1727204112.14603: done sending task result for task 12b410aa-8751-72e9-1a19-000000000084 12755 1727204112.14607: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12755 1727204112.14653: no more pending results, returning what we have 12755 1727204112.14657: results queue empty 12755 1727204112.14658: checking
for any_errors_fatal 12755 1727204112.14664: done checking for any_errors_fatal 12755 1727204112.14665: checking for max_fail_percentage 12755 1727204112.14667: done checking for max_fail_percentage 12755 1727204112.14668: checking to see if all hosts have failed and the running result is not ok 12755 1727204112.14669: done checking to see if all hosts have failed 12755 1727204112.14670: getting the remaining hosts for this loop 12755 1727204112.14672: done getting the remaining hosts for this loop 12755 1727204112.14676: getting the next task for host managed-node1 12755 1727204112.14683: done getting next task for host managed-node1 12755 1727204112.14687: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12755 1727204112.14693: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204112.14713: getting variables 12755 1727204112.14715: in VariableManager get_vars() 12755 1727204112.14774: Calling all_inventory to load vars for managed-node1 12755 1727204112.14777: Calling groups_inventory to load vars for managed-node1 12755 1727204112.14781: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204112.14795: Calling all_plugins_play to load vars for managed-node1 12755 1727204112.14800: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204112.14804: Calling groups_plugins_play to load vars for managed-node1 12755 1727204112.19311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204112.22343: done with get_vars() 12755 1727204112.22384: done getting variables 12755 1727204112.22835: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.116) 0:00:37.464 *****

12755 1727204112.22879: entering _queue_task() for managed-node1/package 12755 1727204112.23658: worker is 1 (out of 1 available) 12755 1727204112.23674: exiting _queue_task() for managed-node1/package 12755 1727204112.23692: done queuing things up, now waiting for results queue to drain 12755 1727204112.23694: waiting for pending results...
12755 1727204112.24608: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12755 1727204112.25397: in run() - task 12b410aa-8751-72e9-1a19-000000000085 12755 1727204112.25403: variable 'ansible_search_path' from source: unknown 12755 1727204112.25408: variable 'ansible_search_path' from source: unknown 12755 1727204112.25412: calling self._execute() 12755 1727204112.25955: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204112.25973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204112.26062: variable 'omit' from source: magic vars 12755 1727204112.27695: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.27699: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204112.28035: variable 'network_state' from source: role '' defaults 12755 1727204112.28039: Evaluated conditional (network_state != {}): False 12755 1727204112.28042: when evaluation is False, skipping this task 12755 1727204112.28045: _execute() done 12755 1727204112.28048: dumping result to json 12755 1727204112.28095: done dumping result, returning 12755 1727204112.28099: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-72e9-1a19-000000000085] 12755 1727204112.28102: sending task result for task 12b410aa-8751-72e9-1a19-000000000085 12755 1727204112.28451: done sending task result for task 12b410aa-8751-72e9-1a19-000000000085 12755 1727204112.28456: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204112.28514: no more pending results, returning what we have 12755 1727204112.28518: results queue empty 12755 1727204112.28519: checking for 
any_errors_fatal 12755 1727204112.28526: done checking for any_errors_fatal 12755 1727204112.28527: checking for max_fail_percentage 12755 1727204112.28529: done checking for max_fail_percentage 12755 1727204112.28530: checking to see if all hosts have failed and the running result is not ok 12755 1727204112.28531: done checking to see if all hosts have failed 12755 1727204112.28532: getting the remaining hosts for this loop 12755 1727204112.28533: done getting the remaining hosts for this loop 12755 1727204112.28539: getting the next task for host managed-node1 12755 1727204112.28547: done getting next task for host managed-node1 12755 1727204112.28551: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12755 1727204112.28556: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204112.28580: getting variables 12755 1727204112.28582: in VariableManager get_vars() 12755 1727204112.28648: Calling all_inventory to load vars for managed-node1 12755 1727204112.28652: Calling groups_inventory to load vars for managed-node1 12755 1727204112.28655: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204112.28670: Calling all_plugins_play to load vars for managed-node1 12755 1727204112.28674: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204112.28678: Calling groups_plugins_play to load vars for managed-node1 12755 1727204112.31526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204112.35835: done with get_vars() 12755 1727204112.35879: done getting variables 12755 1727204112.36156: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.133) 0:00:37.597 ***** 12755 1727204112.36201: entering _queue_task() for managed-node1/service 12755 1727204112.36598: worker is 1 (out of 1 available) 12755 1727204112.36612: exiting _queue_task() for managed-node1/service 12755 1727204112.36626: done queuing things up, now waiting for results queue to drain 12755 1727204112.36628: waiting for pending results... 
12755 1727204112.37011: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12755 1727204112.37109: in run() - task 12b410aa-8751-72e9-1a19-000000000086 12755 1727204112.37134: variable 'ansible_search_path' from source: unknown 12755 1727204112.37143: variable 'ansible_search_path' from source: unknown 12755 1727204112.37191: calling self._execute() 12755 1727204112.37319: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204112.37339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204112.37355: variable 'omit' from source: magic vars 12755 1727204112.37812: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.37874: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204112.38012: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204112.38280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204112.40861: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204112.40962: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204112.41033: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204112.41061: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204112.41099: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204112.41199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12755 1727204112.41251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204112.41359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.41363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204112.41365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204112.41421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204112.41454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204112.41494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.41545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204112.41567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204112.41625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204112.41685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204112.41694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.41749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204112.41770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204112.42004: variable 'network_connections' from source: task vars 12755 1727204112.42194: variable 'controller_profile' from source: play vars 12755 1727204112.42198: variable 'controller_profile' from source: play vars 12755 1727204112.42219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204112.42413: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204112.42476: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204112.42519: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 
1727204112.42595: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204112.42622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204112.42658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204112.42693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.42727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204112.42793: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204112.43183: variable 'network_connections' from source: task vars 12755 1727204112.43186: variable 'controller_profile' from source: play vars 12755 1727204112.43232: variable 'controller_profile' from source: play vars 12755 1727204112.43267: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12755 1727204112.43278: when evaluation is False, skipping this task 12755 1727204112.43291: _execute() done 12755 1727204112.43303: dumping result to json 12755 1727204112.43311: done dumping result, returning 12755 1727204112.43325: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000086] 12755 1727204112.43399: sending task result for task 12b410aa-8751-72e9-1a19-000000000086 12755 1727204112.43475: done sending task 
result for task 12b410aa-8751-72e9-1a19-000000000086 12755 1727204112.43484: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12755 1727204112.43562: no more pending results, returning what we have 12755 1727204112.43566: results queue empty 12755 1727204112.43568: checking for any_errors_fatal 12755 1727204112.43577: done checking for any_errors_fatal 12755 1727204112.43578: checking for max_fail_percentage 12755 1727204112.43580: done checking for max_fail_percentage 12755 1727204112.43581: checking to see if all hosts have failed and the running result is not ok 12755 1727204112.43583: done checking to see if all hosts have failed 12755 1727204112.43584: getting the remaining hosts for this loop 12755 1727204112.43586: done getting the remaining hosts for this loop 12755 1727204112.43594: getting the next task for host managed-node1 12755 1727204112.43603: done getting next task for host managed-node1 12755 1727204112.43608: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12755 1727204112.43612: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204112.43635: getting variables 12755 1727204112.43638: in VariableManager get_vars() 12755 1727204112.43813: Calling all_inventory to load vars for managed-node1 12755 1727204112.43817: Calling groups_inventory to load vars for managed-node1 12755 1727204112.43820: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204112.43834: Calling all_plugins_play to load vars for managed-node1 12755 1727204112.43837: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204112.43841: Calling groups_plugins_play to load vars for managed-node1 12755 1727204112.46218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204112.50965: done with get_vars() 12755 1727204112.51004: done getting variables 12755 1727204112.51074: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:12 -0400 (0:00:00.149) 0:00:37.746 ***** 12755 1727204112.51114: entering _queue_task() for managed-node1/service 12755 1727204112.51458: worker is 1 (out of 1 available) 12755 1727204112.51471: exiting _queue_task() for managed-node1/service 12755 1727204112.51485: done queuing things up, now waiting for results queue to drain 12755 1727204112.51487: waiting for pending results... 
12755 1727204112.51908: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12755 1727204112.51962: in run() - task 12b410aa-8751-72e9-1a19-000000000087 12755 1727204112.51984: variable 'ansible_search_path' from source: unknown 12755 1727204112.51995: variable 'ansible_search_path' from source: unknown 12755 1727204112.52044: calling self._execute() 12755 1727204112.52165: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204112.52179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204112.52200: variable 'omit' from source: magic vars 12755 1727204112.52632: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.52653: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204112.52876: variable 'network_provider' from source: set_fact 12755 1727204112.52943: variable 'network_state' from source: role '' defaults 12755 1727204112.52947: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12755 1727204112.52950: variable 'omit' from source: magic vars 12755 1727204112.53002: variable 'omit' from source: magic vars 12755 1727204112.53042: variable 'network_service_name' from source: role '' defaults 12755 1727204112.53132: variable 'network_service_name' from source: role '' defaults 12755 1727204112.53279: variable '__network_provider_setup' from source: role '' defaults 12755 1727204112.53296: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204112.53488: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204112.53493: variable '__network_packages_default_nm' from source: role '' defaults 12755 1727204112.53496: variable '__network_packages_default_nm' from source: role '' defaults 12755 1727204112.53768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 12755 1727204112.56270: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204112.56376: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204112.56429: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204112.56479: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204112.56518: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204112.56624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204112.56670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204112.56708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.56767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204112.56791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204112.56871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12755 1727204112.56892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204112.56927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.57089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204112.57095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204112.57358: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12755 1727204112.57527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204112.57562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204112.57598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.57657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204112.57678: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204112.57799: variable 'ansible_python' from source: facts 12755 1727204112.57832: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12755 1727204112.57943: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204112.58056: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12755 1727204112.58238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204112.58272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204112.58313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.58368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204112.58494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204112.58500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204112.58511: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204112.58539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.58597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204112.58621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204112.58806: variable 'network_connections' from source: task vars 12755 1727204112.58846: variable 'controller_profile' from source: play vars 12755 1727204112.58927: variable 'controller_profile' from source: play vars 12755 1727204112.59068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204112.59325: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204112.59595: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204112.59598: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204112.59601: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204112.59604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204112.59730: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204112.59773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204112.59870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204112.59994: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204112.60766: variable 'network_connections' from source: task vars 12755 1727204112.60780: variable 'controller_profile' from source: play vars 12755 1727204112.60987: variable 'controller_profile' from source: play vars 12755 1727204112.61122: variable '__network_packages_default_wireless' from source: role '' defaults 12755 1727204112.61231: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204112.62073: variable 'network_connections' from source: task vars 12755 1727204112.62324: variable 'controller_profile' from source: play vars 12755 1727204112.62328: variable 'controller_profile' from source: play vars 12755 1727204112.62406: variable '__network_packages_default_team' from source: role '' defaults 12755 1727204112.62561: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204112.63377: variable 'network_connections' from source: task vars 12755 1727204112.63388: variable 'controller_profile' from source: play vars 12755 1727204112.63594: variable 'controller_profile' from source: play vars 12755 1727204112.63678: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204112.63815: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 12755 1727204112.63967: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204112.64045: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204112.64702: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12755 1727204112.66165: variable 'network_connections' from source: task vars 12755 1727204112.66177: variable 'controller_profile' from source: play vars 12755 1727204112.66306: variable 'controller_profile' from source: play vars 12755 1727204112.66557: variable 'ansible_distribution' from source: facts 12755 1727204112.66560: variable '__network_rh_distros' from source: role '' defaults 12755 1727204112.66562: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.66565: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12755 1727204112.66868: variable 'ansible_distribution' from source: facts 12755 1727204112.67097: variable '__network_rh_distros' from source: role '' defaults 12755 1727204112.67100: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.67103: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12755 1727204112.67438: variable 'ansible_distribution' from source: facts 12755 1727204112.67704: variable '__network_rh_distros' from source: role '' defaults 12755 1727204112.67894: variable 'ansible_distribution_major_version' from source: facts 12755 1727204112.67897: variable 'network_provider' from source: set_fact 12755 1727204112.67900: variable 'omit' from source: magic vars 12755 1727204112.67902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204112.67905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204112.67907: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204112.67919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204112.67936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204112.67978: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204112.68030: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204112.68040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204112.68279: Set connection var ansible_connection to ssh 12755 1727204112.68361: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204112.68368: Set connection var ansible_shell_type to sh 12755 1727204112.68388: Set connection var ansible_timeout to 10 12755 1727204112.68403: Set connection var ansible_shell_executable to /bin/sh 12755 1727204112.68415: Set connection var ansible_pipelining to False 12755 1727204112.68450: variable 'ansible_shell_executable' from source: unknown 12755 1727204112.68679: variable 'ansible_connection' from source: unknown 12755 1727204112.68682: variable 'ansible_module_compression' from source: unknown 12755 1727204112.68685: variable 'ansible_shell_type' from source: unknown 12755 1727204112.68687: variable 'ansible_shell_executable' from source: unknown 12755 1727204112.68692: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204112.68694: variable 'ansible_pipelining' from source: unknown 12755 1727204112.68696: variable 'ansible_timeout' from source: unknown 12755 1727204112.68698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204112.68861: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204112.68913: variable 'omit' from source: magic vars 12755 1727204112.68926: starting attempt loop 12755 1727204112.69095: running the handler 12755 1727204112.69224: variable 'ansible_facts' from source: unknown 12755 1727204112.70519: _low_level_execute_command(): starting 12755 1727204112.70538: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204112.71427: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204112.71454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204112.71476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204112.71531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204112.71643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204112.73450: stdout chunk (state=3): >>>/root <<< 12755 
1727204112.73611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204112.73828: stderr chunk (state=3): >>><<< 12755 1727204112.73831: stdout chunk (state=3): >>><<< 12755 1727204112.73910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204112.73914: _low_level_execute_command(): starting 12755 1727204112.73918: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091 `" && echo ansible-tmp-1727204112.7385137-15106-149950269645091="` echo /root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091 `" ) && sleep 0' 12755 1727204112.75305: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204112.75340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204112.75408: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204112.75491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204112.75564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204112.75883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204112.77907: stdout chunk (state=3): >>>ansible-tmp-1727204112.7385137-15106-149950269645091=/root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091 <<< 12755 1727204112.78139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204112.78237: stderr chunk (state=3): >>><<< 12755 1727204112.78247: stdout chunk (state=3): >>><<< 12755 1727204112.78271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204112.7385137-15106-149950269645091=/root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204112.78315: variable 'ansible_module_compression' from source: unknown 12755 1727204112.78377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12755 1727204112.78452: variable 'ansible_facts' from source: unknown 12755 1727204112.78676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/AnsiballZ_systemd.py 12755 1727204112.78940: Sending initial data 12755 1727204112.78943: Sent initial data (156 bytes) 12755 1727204112.79457: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204112.79473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204112.79592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204112.79609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204112.79630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204112.79907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204112.81562: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12755 1727204112.81575: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 12755 1727204112.81587: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 12755 1727204112.81607: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 
12755 1727204112.81667: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12755 1727204112.81734: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpn020l1eu /root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/AnsiballZ_systemd.py <<< 12755 1727204112.81754: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/AnsiballZ_systemd.py" <<< 12755 1727204112.81785: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpn020l1eu" to remote "/root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/AnsiballZ_systemd.py" <<< 12755 1727204112.84715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204112.84722: stdout chunk (state=3): >>><<< 12755 1727204112.84725: stderr chunk (state=3): >>><<< 12755 1727204112.84727: done transferring module to remote 12755 1727204112.84729: _low_level_execute_command(): starting 12755 1727204112.84735: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/ /root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/AnsiballZ_systemd.py && sleep 0' 12755 1727204112.85209: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204112.85222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204112.85239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204112.85261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204112.85301: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204112.85399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204112.85429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204112.85446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204112.85533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204112.87805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204112.87996: stderr chunk (state=3): >>><<< 12755 1727204112.87999: stdout chunk (state=3): >>><<< 12755 1727204112.88004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204112.88007: _low_level_execute_command(): starting 12755 1727204112.88010: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/AnsiballZ_systemd.py && sleep 0' 12755 1727204112.88633: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204112.88662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204112.88677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204112.88697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204112.88795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204112.89014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204112.89109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204113.23282: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "12148736", "MemoryAvailable": "infinity", "CPUUsageNSec": "885928000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "inf<<< 12755 1727204113.23516: stdout chunk (state=3): >>>inity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", 
"PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": 
"network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", 
"StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12755 1727204113.25519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204113.25613: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. <<< 12755 1727204113.25897: stderr chunk (state=3): >>><<< 12755 1727204113.25901: stdout chunk (state=3): >>><<< 12755 1727204113.25906: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "12148736", "MemoryAvailable": "infinity", "CPUUsageNSec": "885928000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", 
"MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": 
"no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204113.26360: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204113.26422: _low_level_execute_command(): starting 12755 1727204113.26502: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204112.7385137-15106-149950269645091/ > /dev/null 2>&1 && sleep 0' 12755 1727204113.27643: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204113.28006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204113.28023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204113.28066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204113.28213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204113.30208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204113.30300: stderr chunk (state=3): >>><<< 12755 1727204113.30408: stdout chunk (state=3): >>><<< 12755 1727204113.30497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204113.30501: handler run complete 12755 1727204113.30623: attempt loop complete, returning result 12755 1727204113.30632: _execute() done 12755 1727204113.30640: dumping result to json 12755 1727204113.30894: done dumping result, returning 12755 1727204113.30898: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-72e9-1a19-000000000087] 12755 1727204113.30900: sending task result for task 12b410aa-8751-72e9-1a19-000000000087 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204113.31117: no more pending results, returning what we have 12755 1727204113.31124: results queue empty 12755 1727204113.31126: checking for any_errors_fatal 12755 1727204113.31133: done checking for any_errors_fatal 12755 1727204113.31134: checking for max_fail_percentage 12755 1727204113.31137: done checking for max_fail_percentage 12755 1727204113.31138: checking to see if all hosts have failed and the running result is not ok 12755 1727204113.31139: done checking to see if all hosts have failed 12755 1727204113.31140: getting the remaining hosts for this loop 12755 1727204113.31142: done getting the remaining hosts for this loop 12755 1727204113.31147: getting the next task for host managed-node1 12755 1727204113.31156: done getting next task for host managed-node1 12755 1727204113.31160: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12755 1727204113.31164: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204113.31179: getting variables 12755 1727204113.31182: in VariableManager get_vars() 12755 1727204113.31248: Calling all_inventory to load vars for managed-node1 12755 1727204113.31252: Calling groups_inventory to load vars for managed-node1 12755 1727204113.31255: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204113.31268: Calling all_plugins_play to load vars for managed-node1 12755 1727204113.31272: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204113.31277: Calling groups_plugins_play to load vars for managed-node1 12755 1727204113.32331: done sending task result for task 12b410aa-8751-72e9-1a19-000000000087 12755 1727204113.32334: WORKER PROCESS EXITING 12755 1727204113.36509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204113.41658: done with get_vars() 12755 1727204113.41704: done getting variables 12755 1727204113.41775: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:13 -0400 (0:00:00.907) 0:00:38.653 ***** 12755 1727204113.41817: entering _queue_task() for 
managed-node1/service 12755 1727204113.42183: worker is 1 (out of 1 available) 12755 1727204113.42200: exiting _queue_task() for managed-node1/service 12755 1727204113.42215: done queuing things up, now waiting for results queue to drain 12755 1727204113.42216: waiting for pending results... 12755 1727204113.42497: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12755 1727204113.42669: in run() - task 12b410aa-8751-72e9-1a19-000000000088 12755 1727204113.42696: variable 'ansible_search_path' from source: unknown 12755 1727204113.42707: variable 'ansible_search_path' from source: unknown 12755 1727204113.42759: calling self._execute() 12755 1727204113.42951: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204113.42967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204113.42987: variable 'omit' from source: magic vars 12755 1727204113.43895: variable 'ansible_distribution_major_version' from source: facts 12755 1727204113.43899: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204113.44121: variable 'network_provider' from source: set_fact 12755 1727204113.44137: Evaluated conditional (network_provider == "nm"): True 12755 1727204113.44470: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204113.44688: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12755 1727204113.45072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204113.49523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204113.49621: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204113.49669: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204113.49714: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204113.49751: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204113.49851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204113.49941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204113.49944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204113.49995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204113.50019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204113.50087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204113.50126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204113.50167: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204113.50266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204113.50270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204113.50308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204113.50343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204113.50384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204113.50438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204113.50461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204113.50696: variable 'network_connections' from source: task vars 12755 1727204113.50700: variable 'controller_profile' from source: play vars 12755 1727204113.50760: variable 'controller_profile' from source: play vars 12755 
1727204113.50878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204113.51091: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204113.51144: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204113.51187: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204113.51229: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204113.51351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204113.51354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204113.51392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204113.51442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204113.51515: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204113.52034: variable 'network_connections' from source: task vars 12755 1727204113.52047: variable 'controller_profile' from source: play vars 12755 1727204113.52134: variable 'controller_profile' from source: play vars 12755 1727204113.52177: Evaluated conditional (__network_wpa_supplicant_required): False 12755 1727204113.52196: when evaluation is False, skipping this task 12755 
1727204113.52198: _execute() done 12755 1727204113.52225: dumping result to json 12755 1727204113.52228: done dumping result, returning 12755 1727204113.52230: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-72e9-1a19-000000000088] 12755 1727204113.52295: sending task result for task 12b410aa-8751-72e9-1a19-000000000088 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12755 1727204113.52490: no more pending results, returning what we have 12755 1727204113.52495: results queue empty 12755 1727204113.52496: checking for any_errors_fatal 12755 1727204113.52520: done checking for any_errors_fatal 12755 1727204113.52521: checking for max_fail_percentage 12755 1727204113.52523: done checking for max_fail_percentage 12755 1727204113.52525: checking to see if all hosts have failed and the running result is not ok 12755 1727204113.52526: done checking to see if all hosts have failed 12755 1727204113.52527: getting the remaining hosts for this loop 12755 1727204113.52529: done getting the remaining hosts for this loop 12755 1727204113.52534: getting the next task for host managed-node1 12755 1727204113.52543: done getting next task for host managed-node1 12755 1727204113.52547: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12755 1727204113.52551: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204113.52573: getting variables 12755 1727204113.52575: in VariableManager get_vars() 12755 1727204113.52750: Calling all_inventory to load vars for managed-node1 12755 1727204113.52754: Calling groups_inventory to load vars for managed-node1 12755 1727204113.52758: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204113.52766: done sending task result for task 12b410aa-8751-72e9-1a19-000000000088 12755 1727204113.52769: WORKER PROCESS EXITING 12755 1727204113.52781: Calling all_plugins_play to load vars for managed-node1 12755 1727204113.52785: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204113.52946: Calling groups_plugins_play to load vars for managed-node1 12755 1727204113.54554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204113.56715: done with get_vars() 12755 1727204113.56738: done getting variables 12755 1727204113.56792: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:13 -0400 (0:00:00.150) 0:00:38.803 ***** 12755 1727204113.56821: entering _queue_task() for managed-node1/service 12755 1727204113.57071: worker is 1 (out of 1 available) 12755 1727204113.57085: exiting _queue_task() for managed-node1/service 12755 1727204113.57102: done queuing things up, now waiting for results queue to drain 12755 1727204113.57104: waiting for pending results... 
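For reference, the invocation recorded earlier in this log (the `module_args` dump for `ansible.legacy.systemd`) corresponds to a task roughly like the following. This is a sketch reconstructed from the logged arguments, not the role's actual task file, which lives at the `roles/network/tasks/main.yml` path shown in the task banners:

```yaml
# Sketch reconstructed from the logged module_args; the real task in
# fedora.linux_system_roles.network may differ in wording and options.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # matches the "output has been hidden" result in the log
```

The `censored` result above is a direct consequence of that `no_log: true`: the module still ran and returned the full unit properties, but the callback replaced them in the displayed result.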
12755 1727204113.57303: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 12755 1727204113.57413: in run() - task 12b410aa-8751-72e9-1a19-000000000089 12755 1727204113.57434: variable 'ansible_search_path' from source: unknown 12755 1727204113.57438: variable 'ansible_search_path' from source: unknown 12755 1727204113.57473: calling self._execute() 12755 1727204113.57567: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204113.57572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204113.57582: variable 'omit' from source: magic vars 12755 1727204113.57910: variable 'ansible_distribution_major_version' from source: facts 12755 1727204113.57922: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204113.58022: variable 'network_provider' from source: set_fact 12755 1727204113.58030: Evaluated conditional (network_provider == "initscripts"): False 12755 1727204113.58033: when evaluation is False, skipping this task 12755 1727204113.58038: _execute() done 12755 1727204113.58042: dumping result to json 12755 1727204113.58047: done dumping result, returning 12755 1727204113.58055: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-72e9-1a19-000000000089] 12755 1727204113.58061: sending task result for task 12b410aa-8751-72e9-1a19-000000000089 12755 1727204113.58154: done sending task result for task 12b410aa-8751-72e9-1a19-000000000089 12755 1727204113.58157: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204113.58210: no more pending results, returning what we have 12755 1727204113.58214: results queue empty 12755 1727204113.58215: checking for any_errors_fatal 12755 1727204113.58223: done checking for 
any_errors_fatal 12755 1727204113.58224: checking for max_fail_percentage 12755 1727204113.58225: done checking for max_fail_percentage 12755 1727204113.58226: checking to see if all hosts have failed and the running result is not ok 12755 1727204113.58227: done checking to see if all hosts have failed 12755 1727204113.58228: getting the remaining hosts for this loop 12755 1727204113.58230: done getting the remaining hosts for this loop 12755 1727204113.58234: getting the next task for host managed-node1 12755 1727204113.58242: done getting next task for host managed-node1 12755 1727204113.58246: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12755 1727204113.58249: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204113.58276: getting variables 12755 1727204113.58278: in VariableManager get_vars() 12755 1727204113.58332: Calling all_inventory to load vars for managed-node1 12755 1727204113.58335: Calling groups_inventory to load vars for managed-node1 12755 1727204113.58338: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204113.58348: Calling all_plugins_play to load vars for managed-node1 12755 1727204113.58351: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204113.58355: Calling groups_plugins_play to load vars for managed-node1 12755 1727204113.60290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204113.61992: done with get_vars() 12755 1727204113.62025: done getting variables 12755 1727204113.62088: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:13 -0400 (0:00:00.053) 0:00:38.856 ***** 12755 1727204113.62127: entering _queue_task() for managed-node1/copy 12755 1727204113.62427: worker is 1 (out of 1 available) 12755 1727204113.62440: exiting _queue_task() for managed-node1/copy 12755 1727204113.62455: done queuing things up, now waiting for results queue to drain 12755 1727204113.62457: waiting for pending results... 
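The skip recorded above ("Enable network service") hinges on the provider conditional reported in its `false_condition` field. A hedged sketch of that pattern, illustrative only and not the role's exact source:

```yaml
# Illustrative sketch of a provider-gated task. The when: expression matches
# the false_condition string reported in the skip result above.
- name: Enable network service
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"   # evaluated False here, so skipped
```

Because `network_provider` was set to `nm` earlier (`Evaluated conditional (network_provider == "nm"): True`), every initscripts-only task in this run is expected to skip the same way.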
12755 1727204113.62778: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12755 1727204113.62899: in run() - task 12b410aa-8751-72e9-1a19-00000000008a 12755 1727204113.63094: variable 'ansible_search_path' from source: unknown 12755 1727204113.63098: variable 'ansible_search_path' from source: unknown 12755 1727204113.63100: calling self._execute() 12755 1727204113.63103: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204113.63106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204113.63108: variable 'omit' from source: magic vars 12755 1727204113.63548: variable 'ansible_distribution_major_version' from source: facts 12755 1727204113.63569: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204113.63722: variable 'network_provider' from source: set_fact 12755 1727204113.63736: Evaluated conditional (network_provider == "initscripts"): False 12755 1727204113.63745: when evaluation is False, skipping this task 12755 1727204113.63753: _execute() done 12755 1727204113.63762: dumping result to json 12755 1727204113.63770: done dumping result, returning 12755 1727204113.63784: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-72e9-1a19-00000000008a] 12755 1727204113.63799: sending task result for task 12b410aa-8751-72e9-1a19-00000000008a skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12755 1727204113.63968: no more pending results, returning what we have 12755 1727204113.63973: results queue empty 12755 1727204113.63975: checking for any_errors_fatal 12755 1727204113.63980: done checking for any_errors_fatal 12755 1727204113.63982: checking for max_fail_percentage 12755 
1727204113.63984: done checking for max_fail_percentage 12755 1727204113.63985: checking to see if all hosts have failed and the running result is not ok 12755 1727204113.63986: done checking to see if all hosts have failed 12755 1727204113.63987: getting the remaining hosts for this loop 12755 1727204113.63991: done getting the remaining hosts for this loop 12755 1727204113.64050: getting the next task for host managed-node1 12755 1727204113.64101: done getting next task for host managed-node1 12755 1727204113.64105: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12755 1727204113.64109: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204113.64132: done sending task result for task 12b410aa-8751-72e9-1a19-00000000008a 12755 1727204113.64135: WORKER PROCESS EXITING 12755 1727204113.64147: getting variables 12755 1727204113.64149: in VariableManager get_vars() 12755 1727204113.64206: Calling all_inventory to load vars for managed-node1 12755 1727204113.64209: Calling groups_inventory to load vars for managed-node1 12755 1727204113.64212: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204113.64225: Calling all_plugins_play to load vars for managed-node1 12755 1727204113.64228: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204113.64232: Calling groups_plugins_play to load vars for managed-node1 12755 1727204113.65461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204113.67553: done with get_vars() 12755 1727204113.67577: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:55:13 -0400 (0:00:00.055) 0:00:38.912 ***** 12755 1727204113.67653: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 12755 1727204113.67909: worker is 1 (out of 1 available) 12755 1727204113.67926: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 12755 1727204113.67942: done queuing things up, now waiting for results queue to drain 12755 1727204113.67944: waiting for pending results... 
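The "Configure networking connection profiles" task consumes the `network_connections` variable that this log repeatedly resolves from play vars (along with `controller_profile`). A minimal play that would drive it might look like this; the profile contents are hypothetical, since only the variable names, not their values, appear in the log:

```yaml
# Hypothetical invocation sketch: only the names network_connections and
# controller_profile are taken from the log; the profile body is assumed.
- hosts: managed-node1
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: "{{ controller_profile }}"
        state: up
```

The subsequent `__lsr_ansible_managed` lookup of `get_ansible_managed.j2` is the role rendering its "ansible managed" header template before writing the connection profiles.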
12755 1727204113.68141: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12755 1727204113.68248: in run() - task 12b410aa-8751-72e9-1a19-00000000008b 12755 1727204113.68261: variable 'ansible_search_path' from source: unknown 12755 1727204113.68265: variable 'ansible_search_path' from source: unknown 12755 1727204113.68303: calling self._execute() 12755 1727204113.68394: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204113.68402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204113.68411: variable 'omit' from source: magic vars 12755 1727204113.68743: variable 'ansible_distribution_major_version' from source: facts 12755 1727204113.68756: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204113.68763: variable 'omit' from source: magic vars 12755 1727204113.68812: variable 'omit' from source: magic vars 12755 1727204113.68958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204113.70857: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204113.70915: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204113.70945: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204113.70974: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204113.70999: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204113.71063: variable 'network_provider' from source: set_fact 12755 1727204113.71175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204113.71199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204113.71225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204113.71259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204113.71272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204113.71338: variable 'omit' from source: magic vars 12755 1727204113.71431: variable 'omit' from source: magic vars 12755 1727204113.71521: variable 'network_connections' from source: task vars 12755 1727204113.71530: variable 'controller_profile' from source: play vars 12755 1727204113.71585: variable 'controller_profile' from source: play vars 12755 1727204113.71707: variable 'omit' from source: magic vars 12755 1727204113.71715: variable '__lsr_ansible_managed' from source: task vars 12755 1727204113.71764: variable '__lsr_ansible_managed' from source: task vars 12755 1727204113.71942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12755 1727204113.72129: Loaded config def from plugin (lookup/template) 12755 1727204113.72132: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12755 1727204113.72157: File lookup term: get_ansible_managed.j2 12755 1727204113.72160: 
variable 'ansible_search_path' from source: unknown 12755 1727204113.72167: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12755 1727204113.72179: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12755 1727204113.72201: variable 'ansible_search_path' from source: unknown 12755 1727204113.77595: variable 'ansible_managed' from source: unknown 12755 1727204113.77732: variable 'omit' from source: magic vars 12755 1727204113.77759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204113.77781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204113.77800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204113.77815: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204113.77828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204113.77853: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204113.77858: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204113.77861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204113.77942: Set connection var ansible_connection to ssh 12755 1727204113.77949: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204113.77953: Set connection var ansible_shell_type to sh 12755 1727204113.77966: Set connection var ansible_timeout to 10 12755 1727204113.77976: Set connection var ansible_shell_executable to /bin/sh 12755 1727204113.77978: Set connection var ansible_pipelining to False 12755 1727204113.78000: variable 'ansible_shell_executable' from source: unknown 12755 1727204113.78003: variable 'ansible_connection' from source: unknown 12755 1727204113.78008: variable 'ansible_module_compression' from source: unknown 12755 1727204113.78010: variable 'ansible_shell_type' from source: unknown 12755 1727204113.78015: variable 'ansible_shell_executable' from source: unknown 12755 1727204113.78018: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204113.78025: variable 'ansible_pipelining' from source: unknown 12755 1727204113.78028: variable 'ansible_timeout' from source: unknown 12755 1727204113.78035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204113.78147: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204113.78159: variable 'omit' from source: magic vars 12755 1727204113.78162: starting attempt loop 12755 1727204113.78166: running the handler 12755 1727204113.78180: _low_level_execute_command(): starting 12755 1727204113.78188: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204113.78727: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204113.78733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204113.78736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204113.78738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204113.78793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204113.78797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204113.78864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204113.80723: stdout chunk (state=3): >>>/root <<< 12755 
1727204113.80872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204113.80878: stdout chunk (state=3): >>><<< 12755 1727204113.80886: stderr chunk (state=3): >>><<< 12755 1727204113.80910: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204113.80925: _low_level_execute_command(): starting 12755 1727204113.80931: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090 `" && echo ansible-tmp-1727204113.8091068-15149-12824595606090="` echo /root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090 `" ) && sleep 0' 12755 1727204113.81395: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config <<< 12755 1727204113.81399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204113.81402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204113.81411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204113.81425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204113.81481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204113.81484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204113.81527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204113.83639: stdout chunk (state=3): >>>ansible-tmp-1727204113.8091068-15149-12824595606090=/root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090 <<< 12755 1727204113.83765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204113.83817: stderr chunk (state=3): >>><<< 12755 1727204113.83823: stdout chunk (state=3): >>><<< 12755 1727204113.83840: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204113.8091068-15149-12824595606090=/root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090 , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204113.83879: variable 'ansible_module_compression' from source: unknown 12755 1727204113.83922: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12755 1727204113.83963: variable 'ansible_facts' from source: unknown 12755 1727204113.84057: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/AnsiballZ_network_connections.py 12755 1727204113.84172: Sending initial data 12755 1727204113.84176: Sent initial data (167 bytes) 12755 1727204113.84646: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204113.84649: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204113.84656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204113.84659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204113.84716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204113.84720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204113.84762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204113.86520: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 12755 1727204113.86524: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204113.86570: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12755 1727204113.86618: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpkmfmkd3b /root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/AnsiballZ_network_connections.py <<< 12755 1727204113.86626: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/AnsiballZ_network_connections.py" <<< 12755 1727204113.86656: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpkmfmkd3b" to remote "/root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/AnsiballZ_network_connections.py" <<< 12755 1727204113.88043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204113.88052: stdout chunk (state=3): >>><<< 12755 1727204113.88151: stderr chunk (state=3): >>><<< 12755 1727204113.88154: done transferring module to remote 12755 1727204113.88157: _low_level_execute_command(): starting 12755 1727204113.88159: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/ /root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/AnsiballZ_network_connections.py && sleep 0' 12755 1727204113.88683: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204113.88692: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204113.88701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204113.88721: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204113.88731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204113.88738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204113.88802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204113.88808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204113.88854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204113.91121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204113.91125: stdout chunk (state=3): >>><<< 12755 1727204113.91127: stderr chunk (state=3): >>><<< 12755 1727204113.91145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204113.91239: _low_level_execute_command(): starting 12755 1727204113.91243: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/AnsiballZ_network_connections.py && sleep 0' 12755 1727204113.91813: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204113.91927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204113.91933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204113.91935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204113.91940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204113.92003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204114.45099: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a_o3o2lb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a_o3o2lb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/d1ad3f47-f869-4876-b6a6-dbe0ff47e776: error=unknown <<< 12755 1727204114.45813: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12755 1727204114.49098: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204114.49102: stdout chunk (state=3): >>><<< 12755 1727204114.49119: stderr chunk (state=3): >>><<< 12755 1727204114.49139: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a_o3o2lb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a_o3o2lb/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/d1ad3f47-f869-4876-b6a6-dbe0ff47e776: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204114.49199: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204114.49221: _low_level_execute_command(): starting 12755 1727204114.49396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727204113.8091068-15149-12824595606090/ > /dev/null 2>&1 && sleep 0' 12755 1727204114.50568: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204114.50781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204114.50933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204114.50971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204114.53172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204114.53176: stdout chunk (state=3): >>><<< 12755 1727204114.53187: stderr chunk (state=3): >>><<< 12755 1727204114.53210: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204114.53218: handler run complete 12755 1727204114.53263: attempt loop complete, returning result 12755 1727204114.53267: _execute() done 12755 1727204114.53269: dumping result to json 12755 1727204114.53277: done dumping result, returning 12755 1727204114.53290: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-72e9-1a19-00000000008b] 12755 1727204114.53297: sending task result for task 12b410aa-8751-72e9-1a19-00000000008b 12755 1727204114.53542: done sending task result for task 12b410aa-8751-72e9-1a19-00000000008b 12755 1727204114.53547: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12755 1727204114.53680: no more pending results, returning what we have 12755 1727204114.53683: results queue empty 12755 
1727204114.53685: checking for any_errors_fatal 12755 1727204114.53693: done checking for any_errors_fatal 12755 1727204114.53694: checking for max_fail_percentage 12755 1727204114.53696: done checking for max_fail_percentage 12755 1727204114.53697: checking to see if all hosts have failed and the running result is not ok 12755 1727204114.53698: done checking to see if all hosts have failed 12755 1727204114.53699: getting the remaining hosts for this loop 12755 1727204114.53700: done getting the remaining hosts for this loop 12755 1727204114.53892: getting the next task for host managed-node1 12755 1727204114.53901: done getting next task for host managed-node1 12755 1727204114.53905: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12755 1727204114.53909: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204114.53923: getting variables 12755 1727204114.53926: in VariableManager get_vars() 12755 1727204114.53992: Calling all_inventory to load vars for managed-node1 12755 1727204114.53996: Calling groups_inventory to load vars for managed-node1 12755 1727204114.53999: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204114.54011: Calling all_plugins_play to load vars for managed-node1 12755 1727204114.54014: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204114.54018: Calling groups_plugins_play to load vars for managed-node1 12755 1727204114.57536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204114.61226: done with get_vars() 12755 1727204114.61262: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.937) 0:00:39.849 ***** 12755 1727204114.61365: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204114.61915: worker is 1 (out of 1 available) 12755 1727204114.61926: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204114.61939: done queuing things up, now waiting for results queue to drain 12755 1727204114.61941: waiting for pending results... 
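The `changed` result for "Configure networking connection profiles" above shows the role's module being invoked with a single profile (bond0, `persistent_state: absent`, `state: down`, provider `nm`). Reconstructed as role input, the logged `module_args` imply roughly the following playbook variables (a hedged sketch inferred from the log; the surrounding playbook and host targeting are assumptions):

```yaml
# Sketch of the role input implied by the logged module_args above.
# The bond0 profile is removed (persistent_state: absent) and taken
# down (state: down); "nm" selects the NetworkManager provider.
- hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm          # matches "provider": "nm" in the log
        network_connections:
          - name: bond0
            persistent_state: absent
            state: down
```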
12755 1727204114.62182: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 12755 1727204114.62231: in run() - task 12b410aa-8751-72e9-1a19-00000000008c 12755 1727204114.62275: variable 'ansible_search_path' from source: unknown 12755 1727204114.62278: variable 'ansible_search_path' from source: unknown 12755 1727204114.62308: calling self._execute() 12755 1727204114.62438: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.62493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204114.62498: variable 'omit' from source: magic vars 12755 1727204114.63092: variable 'ansible_distribution_major_version' from source: facts 12755 1727204114.63113: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204114.63283: variable 'network_state' from source: role '' defaults 12755 1727204114.63345: Evaluated conditional (network_state != {}): False 12755 1727204114.63348: when evaluation is False, skipping this task 12755 1727204114.63363: _execute() done 12755 1727204114.63365: dumping result to json 12755 1727204114.63368: done dumping result, returning 12755 1727204114.63371: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-72e9-1a19-00000000008c] 12755 1727204114.63373: sending task result for task 12b410aa-8751-72e9-1a19-00000000008c skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204114.63547: no more pending results, returning what we have 12755 1727204114.63554: results queue empty 12755 1727204114.63556: checking for any_errors_fatal 12755 1727204114.63569: done checking for any_errors_fatal 12755 1727204114.63570: checking for max_fail_percentage 12755 1727204114.63572: done checking for max_fail_percentage 12755 1727204114.63573: 
checking to see if all hosts have failed and the running result is not ok 12755 1727204114.63574: done checking to see if all hosts have failed 12755 1727204114.63575: getting the remaining hosts for this loop 12755 1727204114.63577: done getting the remaining hosts for this loop 12755 1727204114.63582: getting the next task for host managed-node1 12755 1727204114.63695: done getting next task for host managed-node1 12755 1727204114.63700: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204114.63705: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204114.63730: getting variables 12755 1727204114.63733: in VariableManager get_vars() 12755 1727204114.63998: Calling all_inventory to load vars for managed-node1 12755 1727204114.64002: Calling groups_inventory to load vars for managed-node1 12755 1727204114.64005: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204114.64017: Calling all_plugins_play to load vars for managed-node1 12755 1727204114.64020: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204114.64024: Calling groups_plugins_play to load vars for managed-node1 12755 1727204114.64711: done sending task result for task 12b410aa-8751-72e9-1a19-00000000008c 12755 1727204114.64715: WORKER PROCESS EXITING 12755 1727204114.66291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204114.69168: done with get_vars() 12755 1727204114.69207: done getting variables 12755 1727204114.69277: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.079) 0:00:39.928 ***** 12755 1727204114.69319: entering _queue_task() for managed-node1/debug 12755 1727204114.69675: worker is 1 (out of 1 available) 12755 1727204114.69793: exiting _queue_task() for managed-node1/debug 12755 1727204114.69808: done queuing things up, now waiting for results queue to drain 12755 1727204114.69809: waiting for pending results... 
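The "Configure networking state" task above is skipped because its conditional evaluates to False: `network_state` comes from the role defaults as `{}`, so `network_state != {}` fails and Ansible reports "Conditional result was False". The pattern the log reflects is a task gated by `when:` (a minimal sketch only; the module arguments shown are hypothetical, and the real task at tasks/main.yml:171 may differ):

```yaml
# Sketch of the skip pattern seen in the log: network_state defaults
# to {}, so the when: clause is False and the module never runs.
# The "state" argument here is an assumption for illustration.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"
  when: network_state != {}
```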
12755 1727204114.70022: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204114.70192: in run() - task 12b410aa-8751-72e9-1a19-00000000008d 12755 1727204114.70218: variable 'ansible_search_path' from source: unknown 12755 1727204114.70227: variable 'ansible_search_path' from source: unknown 12755 1727204114.70272: calling self._execute() 12755 1727204114.70394: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.70409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204114.70426: variable 'omit' from source: magic vars 12755 1727204114.70885: variable 'ansible_distribution_major_version' from source: facts 12755 1727204114.70908: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204114.70921: variable 'omit' from source: magic vars 12755 1727204114.71004: variable 'omit' from source: magic vars 12755 1727204114.71060: variable 'omit' from source: magic vars 12755 1727204114.71112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204114.71176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204114.71208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204114.71234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204114.71254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204114.71300: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204114.71310: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.71319: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12755 1727204114.71520: Set connection var ansible_connection to ssh 12755 1727204114.71550: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204114.71560: Set connection var ansible_shell_type to sh 12755 1727204114.71593: Set connection var ansible_timeout to 10 12755 1727204114.71607: Set connection var ansible_shell_executable to /bin/sh 12755 1727204114.71618: Set connection var ansible_pipelining to False 12755 1727204114.71649: variable 'ansible_shell_executable' from source: unknown 12755 1727204114.71658: variable 'ansible_connection' from source: unknown 12755 1727204114.71675: variable 'ansible_module_compression' from source: unknown 12755 1727204114.71684: variable 'ansible_shell_type' from source: unknown 12755 1727204114.71698: variable 'ansible_shell_executable' from source: unknown 12755 1727204114.71707: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.71717: variable 'ansible_pipelining' from source: unknown 12755 1727204114.71725: variable 'ansible_timeout' from source: unknown 12755 1727204114.71734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204114.71924: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204114.71953: variable 'omit' from source: magic vars 12755 1727204114.71995: starting attempt loop 12755 1727204114.71999: running the handler 12755 1727204114.72159: variable '__network_connections_result' from source: set_fact 12755 1727204114.72241: handler run complete 12755 1727204114.72258: attempt loop complete, returning result 12755 1727204114.72266: _execute() done 12755 1727204114.72295: dumping result to json 12755 1727204114.72298: 
done dumping result, returning 12755 1727204114.72309: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-72e9-1a19-00000000008d] 12755 1727204114.72351: sending task result for task 12b410aa-8751-72e9-1a19-00000000008d 12755 1727204114.72708: done sending task result for task 12b410aa-8751-72e9-1a19-00000000008d 12755 1727204114.72712: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 12755 1727204114.72786: no more pending results, returning what we have 12755 1727204114.72792: results queue empty 12755 1727204114.72793: checking for any_errors_fatal 12755 1727204114.72798: done checking for any_errors_fatal 12755 1727204114.72799: checking for max_fail_percentage 12755 1727204114.72801: done checking for max_fail_percentage 12755 1727204114.72802: checking to see if all hosts have failed and the running result is not ok 12755 1727204114.72803: done checking to see if all hosts have failed 12755 1727204114.72804: getting the remaining hosts for this loop 12755 1727204114.72806: done getting the remaining hosts for this loop 12755 1727204114.72810: getting the next task for host managed-node1 12755 1727204114.72816: done getting next task for host managed-node1 12755 1727204114.72820: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204114.72824: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204114.72838: getting variables 12755 1727204114.72839: in VariableManager get_vars() 12755 1727204114.72986: Calling all_inventory to load vars for managed-node1 12755 1727204114.72992: Calling groups_inventory to load vars for managed-node1 12755 1727204114.72995: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204114.73005: Calling all_plugins_play to load vars for managed-node1 12755 1727204114.73009: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204114.73012: Calling groups_plugins_play to load vars for managed-node1 12755 1727204114.75102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204114.78096: done with get_vars() 12755 1727204114.78131: done getting variables 12755 1727204114.78202: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.089) 0:00:40.018 ***** 12755 1727204114.78242: entering _queue_task() for managed-node1/debug 12755 1727204114.78583: worker is 1 (out of 1 available) 12755 1727204114.78597: exiting _queue_task() for managed-node1/debug 12755 1727204114.78612: done queuing things up, now waiting for results queue to drain 12755 1727204114.78614: waiting for pending results... 
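The "Show stderr messages for the network_connections" task whose result appears above is a plain `debug` of a fact set earlier via `set_fact` (the log loads the `debug` action plugin and resolves `__network_connections_result`). Its output, an empty `stderr_lines` list, comes from a task of roughly this shape (a sketch; the actual task lives at tasks/main.yml:177):

```yaml
# Sketch: debug prints one attribute of the fact recorded by the
# connection-profiles task; an empty stderr_lines list means the
# module wrote nothing to stderr on the managed node.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
```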
12755 1727204114.78923: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204114.79094: in run() - task 12b410aa-8751-72e9-1a19-00000000008e 12755 1727204114.79117: variable 'ansible_search_path' from source: unknown 12755 1727204114.79126: variable 'ansible_search_path' from source: unknown 12755 1727204114.79174: calling self._execute() 12755 1727204114.79292: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.79307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204114.79324: variable 'omit' from source: magic vars 12755 1727204114.79769: variable 'ansible_distribution_major_version' from source: facts 12755 1727204114.79791: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204114.79994: variable 'omit' from source: magic vars 12755 1727204114.79998: variable 'omit' from source: magic vars 12755 1727204114.80000: variable 'omit' from source: magic vars 12755 1727204114.80003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204114.80018: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204114.80046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204114.80071: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204114.80091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204114.80136: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204114.80145: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.80155: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12755 1727204114.80284: Set connection var ansible_connection to ssh 12755 1727204114.80300: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204114.80308: Set connection var ansible_shell_type to sh 12755 1727204114.80327: Set connection var ansible_timeout to 10 12755 1727204114.80343: Set connection var ansible_shell_executable to /bin/sh 12755 1727204114.80356: Set connection var ansible_pipelining to False 12755 1727204114.80386: variable 'ansible_shell_executable' from source: unknown 12755 1727204114.80397: variable 'ansible_connection' from source: unknown 12755 1727204114.80405: variable 'ansible_module_compression' from source: unknown 12755 1727204114.80412: variable 'ansible_shell_type' from source: unknown 12755 1727204114.80443: variable 'ansible_shell_executable' from source: unknown 12755 1727204114.80446: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.80449: variable 'ansible_pipelining' from source: unknown 12755 1727204114.80451: variable 'ansible_timeout' from source: unknown 12755 1727204114.80453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204114.80624: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204114.80660: variable 'omit' from source: magic vars 12755 1727204114.80663: starting attempt loop 12755 1727204114.80666: running the handler 12755 1727204114.80769: variable '__network_connections_result' from source: set_fact 12755 1727204114.80831: variable '__network_connections_result' from source: set_fact 12755 1727204114.80980: handler run complete 12755 1727204114.81027: attempt loop complete, returning result 12755 1727204114.81036: 
_execute() done 12755 1727204114.81043: dumping result to json 12755 1727204114.81053: done dumping result, returning 12755 1727204114.81066: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-72e9-1a19-00000000008e] 12755 1727204114.81075: sending task result for task 12b410aa-8751-72e9-1a19-00000000008e 12755 1727204114.81278: done sending task result for task 12b410aa-8751-72e9-1a19-00000000008e 12755 1727204114.81282: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12755 1727204114.81416: no more pending results, returning what we have 12755 1727204114.81420: results queue empty 12755 1727204114.81422: checking for any_errors_fatal 12755 1727204114.81428: done checking for any_errors_fatal 12755 1727204114.81429: checking for max_fail_percentage 12755 1727204114.81430: done checking for max_fail_percentage 12755 1727204114.81431: checking to see if all hosts have failed and the running result is not ok 12755 1727204114.81433: done checking to see if all hosts have failed 12755 1727204114.81433: getting the remaining hosts for this loop 12755 1727204114.81435: done getting the remaining hosts for this loop 12755 1727204114.81440: getting the next task for host managed-node1 12755 1727204114.81448: done getting next task for host managed-node1 12755 1727204114.81452: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12755 1727204114.81456: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204114.81472: getting variables 12755 1727204114.81474: in VariableManager get_vars() 12755 1727204114.81733: Calling all_inventory to load vars for managed-node1 12755 1727204114.81737: Calling groups_inventory to load vars for managed-node1 12755 1727204114.81740: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204114.81751: Calling all_plugins_play to load vars for managed-node1 12755 1727204114.81754: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204114.81758: Calling groups_plugins_play to load vars for managed-node1 12755 1727204114.83971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204114.86893: done with get_vars() 12755 1727204114.86929: done getting variables 12755 1727204114.87002: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.088) 0:00:40.106 ***** 12755 1727204114.87048: entering _queue_task() for managed-node1/debug 12755 1727204114.87408: 
worker is 1 (out of 1 available) 12755 1727204114.87423: exiting _queue_task() for managed-node1/debug 12755 1727204114.87437: done queuing things up, now waiting for results queue to drain 12755 1727204114.87438: waiting for pending results... 12755 1727204114.87907: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12755 1727204114.87919: in run() - task 12b410aa-8751-72e9-1a19-00000000008f 12755 1727204114.87940: variable 'ansible_search_path' from source: unknown 12755 1727204114.87949: variable 'ansible_search_path' from source: unknown 12755 1727204114.87997: calling self._execute() 12755 1727204114.88119: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.88140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204114.88158: variable 'omit' from source: magic vars 12755 1727204114.88611: variable 'ansible_distribution_major_version' from source: facts 12755 1727204114.88632: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204114.88803: variable 'network_state' from source: role '' defaults 12755 1727204114.88823: Evaluated conditional (network_state != {}): False 12755 1727204114.88833: when evaluation is False, skipping this task 12755 1727204114.88841: _execute() done 12755 1727204114.88850: dumping result to json 12755 1727204114.88896: done dumping result, returning 12755 1727204114.88899: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-72e9-1a19-00000000008f] 12755 1727204114.88901: sending task result for task 12b410aa-8751-72e9-1a19-00000000008f skipping: [managed-node1] => { "false_condition": "network_state != {}" } 12755 1727204114.89057: no more pending results, returning what we have 12755 1727204114.89061: results queue empty 12755 1727204114.89063: checking for 
any_errors_fatal 12755 1727204114.89074: done checking for any_errors_fatal 12755 1727204114.89075: checking for max_fail_percentage 12755 1727204114.89077: done checking for max_fail_percentage 12755 1727204114.89078: checking to see if all hosts have failed and the running result is not ok 12755 1727204114.89079: done checking to see if all hosts have failed 12755 1727204114.89080: getting the remaining hosts for this loop 12755 1727204114.89082: done getting the remaining hosts for this loop 12755 1727204114.89086: getting the next task for host managed-node1 12755 1727204114.89096: done getting next task for host managed-node1 12755 1727204114.89102: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12755 1727204114.89107: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204114.89131: getting variables 12755 1727204114.89134: in VariableManager get_vars() 12755 1727204114.89495: Calling all_inventory to load vars for managed-node1 12755 1727204114.89499: Calling groups_inventory to load vars for managed-node1 12755 1727204114.89503: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204114.89510: done sending task result for task 12b410aa-8751-72e9-1a19-00000000008f 12755 1727204114.89514: WORKER PROCESS EXITING 12755 1727204114.89524: Calling all_plugins_play to load vars for managed-node1 12755 1727204114.89528: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204114.89532: Calling groups_plugins_play to load vars for managed-node1 12755 1727204114.91836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204114.94907: done with get_vars() 12755 1727204114.94948: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:14 -0400 (0:00:00.080) 0:00:40.186 ***** 12755 1727204114.95071: entering _queue_task() for managed-node1/ping 12755 1727204114.95527: worker is 1 (out of 1 available) 12755 1727204114.95541: exiting _queue_task() for managed-node1/ping 12755 1727204114.95555: done queuing things up, now waiting for results queue to drain 12755 1727204114.95557: waiting for pending results... 
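The "Re-test connectivity" task queued above runs the `ping` module for managed-node1 (the log shows `entering _queue_task() for managed-node1/ping` and loads the generic 'normal' action plugin). A minimal equivalent task would look like this (a sketch, not the role source at tasks/main.yml:192):

```yaml
# Sketch: ansible.builtin.ping round-trips a trivial module over the
# already-established SSH connection, which is why the subsequent log
# shows "auto-mux: Trying existing master" (ControlMaster reuse)
# instead of a fresh SSH handshake.
- name: Re-test connectivity
  ansible.builtin.ping:
```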
12755 1727204114.95784: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12755 1727204114.95955: in run() - task 12b410aa-8751-72e9-1a19-000000000090 12755 1727204114.95977: variable 'ansible_search_path' from source: unknown 12755 1727204114.95985: variable 'ansible_search_path' from source: unknown 12755 1727204114.96036: calling self._execute() 12755 1727204114.96156: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.96170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204114.96188: variable 'omit' from source: magic vars 12755 1727204114.96643: variable 'ansible_distribution_major_version' from source: facts 12755 1727204114.96668: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204114.96895: variable 'omit' from source: magic vars 12755 1727204114.96898: variable 'omit' from source: magic vars 12755 1727204114.96901: variable 'omit' from source: magic vars 12755 1727204114.96903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204114.96906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204114.96921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204114.96948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204114.96967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204114.97016: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204114.97029: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.97038: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12755 1727204114.97172: Set connection var ansible_connection to ssh 12755 1727204114.97185: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204114.97195: Set connection var ansible_shell_type to sh 12755 1727204114.97214: Set connection var ansible_timeout to 10 12755 1727204114.97228: Set connection var ansible_shell_executable to /bin/sh 12755 1727204114.97349: Set connection var ansible_pipelining to False 12755 1727204114.97352: variable 'ansible_shell_executable' from source: unknown 12755 1727204114.97355: variable 'ansible_connection' from source: unknown 12755 1727204114.97357: variable 'ansible_module_compression' from source: unknown 12755 1727204114.97359: variable 'ansible_shell_type' from source: unknown 12755 1727204114.97362: variable 'ansible_shell_executable' from source: unknown 12755 1727204114.97364: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204114.97366: variable 'ansible_pipelining' from source: unknown 12755 1727204114.97368: variable 'ansible_timeout' from source: unknown 12755 1727204114.97370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204114.97576: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204114.97597: variable 'omit' from source: magic vars 12755 1727204114.97608: starting attempt loop 12755 1727204114.97615: running the handler 12755 1727204114.97635: _low_level_execute_command(): starting 12755 1727204114.97649: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204114.98426: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204114.98451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 
1727204114.98469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204114.98567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204114.98604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204114.98630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204114.98649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204114.98794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.00616: stdout chunk (state=3): >>>/root <<< 12755 1727204115.00965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.00968: stdout chunk (state=3): >>><<< 12755 1727204115.00971: stderr chunk (state=3): >>><<< 12755 1727204115.00994: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204115.01197: _low_level_execute_command(): starting 12755 1727204115.01201: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960 `" && echo ansible-tmp-1727204115.010015-15182-76104943698960="` echo /root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960 `" ) && sleep 0' 12755 1727204115.02244: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204115.02259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204115.02286: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204115.02309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204115.02321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.02520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204115.02538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.02617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.04732: stdout chunk (state=3): >>>ansible-tmp-1727204115.010015-15182-76104943698960=/root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960 <<< 12755 1727204115.04850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.05100: stderr chunk (state=3): >>><<< 12755 1727204115.05103: stdout chunk (state=3): >>><<< 12755 1727204115.05106: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204115.010015-15182-76104943698960=/root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204115.05136: variable 'ansible_module_compression' from source: unknown 12755 1727204115.05297: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12755 1727204115.05324: variable 'ansible_facts' from source: unknown 12755 1727204115.05493: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/AnsiballZ_ping.py 12755 1727204115.05904: Sending initial data 12755 1727204115.05915: Sent initial data (151 bytes) 12755 1727204115.07387: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204115.07420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204115.07424: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.07629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.07675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.09458: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 12755 1727204115.09556: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204115.09578: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204115.09624: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmppztimnbz /root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/AnsiballZ_ping.py <<< 12755 1727204115.09659: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmppztimnbz" to remote "/root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/AnsiballZ_ping.py" <<< 12755 1727204115.09676: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/AnsiballZ_ping.py" <<< 12755 1727204115.11611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.11720: stderr chunk (state=3): >>><<< 12755 1727204115.11723: stdout chunk (state=3): >>><<< 12755 1727204115.11726: done transferring module to remote 12755 1727204115.11728: _low_level_execute_command(): starting 12755 1727204115.11731: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/ /root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/AnsiballZ_ping.py && sleep 0' 12755 1727204115.13096: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204115.13112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204115.13170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204115.13183: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.13285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204115.13395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.13560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.15562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.15714: stderr chunk (state=3): >>><<< 12755 1727204115.15725: stdout chunk (state=3): >>><<< 12755 1727204115.15749: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204115.15759: _low_level_execute_command(): starting 12755 1727204115.15771: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/AnsiballZ_ping.py && sleep 0' 12755 1727204115.16445: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204115.16474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204115.16512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204115.16621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204115.16647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204115.16670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.16750: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.34711: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12755 1727204115.36103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204115.36169: stderr chunk (state=3): >>><<< 12755 1727204115.36173: stdout chunk (state=3): >>><<< 12755 1727204115.36191: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
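The `{"ping": "pong", "invocation": {...}}` line above is the entire module result: the remote `AnsiballZ_ping.py` prints one JSON document on stdout, and the controller parses it out of the captured stdout chunk. As a minimal sketch of that contract (the real payload is a zipped `AnsibleModule` wrapper; the `ping` function here is only an illustration, not Ansible's actual code), the round-trip looks like:

```python
# Illustrative model of the ping module's result contract (NOT the real
# AnsiballZ payload): the module echoes its "data" argument (default "pong")
# back as a single JSON document on stdout, which the controller reads from
# the SSH stdout stream.
import json

def ping(module_args):
    data = module_args.get("data", "pong")
    # The real ping module deliberately raises when data == "crash";
    # mirrored here for completeness of the sketch.
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping({"data": "pong"})))
# -> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
```

This is why the log's `_low_level_execute_command() done` line shows `rc=0` with the JSON in stdout and only SSH debug noise in stderr: success is signaled by exit code 0 plus a parseable JSON document, and everything on stderr is ignored for the result.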
12755 1727204115.36217: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204115.36228: _low_level_execute_command(): starting 12755 1727204115.36234: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204115.010015-15182-76104943698960/ > /dev/null 2>&1 && sleep 0' 12755 1727204115.36727: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204115.36731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204115.36734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204115.36737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match found <<< 12755 1727204115.36739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.36792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204115.36796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.36850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.38829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.38879: stderr chunk (state=3): >>><<< 12755 1727204115.38883: stdout chunk (state=3): >>><<< 12755 1727204115.38902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204115.38912: handler run complete 12755 
1727204115.38929: attempt loop complete, returning result 12755 1727204115.38933: _execute() done 12755 1727204115.38936: dumping result to json 12755 1727204115.38939: done dumping result, returning 12755 1727204115.38949: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-72e9-1a19-000000000090] 12755 1727204115.38953: sending task result for task 12b410aa-8751-72e9-1a19-000000000090 12755 1727204115.39052: done sending task result for task 12b410aa-8751-72e9-1a19-000000000090 12755 1727204115.39054: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 12755 1727204115.39129: no more pending results, returning what we have 12755 1727204115.39133: results queue empty 12755 1727204115.39134: checking for any_errors_fatal 12755 1727204115.39141: done checking for any_errors_fatal 12755 1727204115.39142: checking for max_fail_percentage 12755 1727204115.39144: done checking for max_fail_percentage 12755 1727204115.39145: checking to see if all hosts have failed and the running result is not ok 12755 1727204115.39147: done checking to see if all hosts have failed 12755 1727204115.39147: getting the remaining hosts for this loop 12755 1727204115.39150: done getting the remaining hosts for this loop 12755 1727204115.39155: getting the next task for host managed-node1 12755 1727204115.39166: done getting next task for host managed-node1 12755 1727204115.39170: ^ task is: TASK: meta (role_complete) 12755 1727204115.39173: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204115.39187: getting variables 12755 1727204115.39191: in VariableManager get_vars() 12755 1727204115.39249: Calling all_inventory to load vars for managed-node1 12755 1727204115.39253: Calling groups_inventory to load vars for managed-node1 12755 1727204115.39256: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204115.39267: Calling all_plugins_play to load vars for managed-node1 12755 1727204115.39271: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204115.39274: Calling groups_plugins_play to load vars for managed-node1 12755 1727204115.40508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204115.42101: done with get_vars() 12755 1727204115.42132: done getting variables 12755 1727204115.42204: done queuing things up, now waiting for results queue to drain 12755 1727204115.42206: results queue empty 12755 1727204115.42207: checking for any_errors_fatal 12755 1727204115.42209: done checking for any_errors_fatal 12755 1727204115.42210: checking for max_fail_percentage 12755 1727204115.42211: done checking for max_fail_percentage 12755 1727204115.42211: checking to see if all hosts have failed and the running result is not ok 12755 1727204115.42212: done checking to see if all hosts have failed 12755 1727204115.42212: getting the remaining hosts for this loop 12755 1727204115.42213: done getting the remaining hosts for this loop 12755 1727204115.42215: getting the next task for host managed-node1 12755 1727204115.42222: done getting next task for host managed-node1 12755 1727204115.42224: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 12755 1727204115.42225: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204115.42227: getting variables 12755 1727204115.42228: in VariableManager get_vars() 12755 1727204115.42247: Calling all_inventory to load vars for managed-node1 12755 1727204115.42249: Calling groups_inventory to load vars for managed-node1 12755 1727204115.42251: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204115.42255: Calling all_plugins_play to load vars for managed-node1 12755 1727204115.42257: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204115.42259: Calling groups_plugins_play to load vars for managed-node1 12755 1727204115.47190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204115.50099: done with get_vars() 12755 1727204115.50140: done getting variables 12755 1727204115.50199: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204115.50332: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.552) 0:00:40.739 ***** 12755 1727204115.50365: entering _queue_task() for managed-node1/command 12755 1727204115.50828: worker is 1 (out of 1 available) 12755 1727204115.50845: exiting _queue_task() for managed-node1/command 12755 1727204115.50859: done queuing things up, now waiting for results queue to drain 12755 
1727204115.50861: waiting for pending results... 12755 1727204115.51312: running TaskExecutor() for managed-node1/TASK: From the active connection, get the port1 profile "bond0.0" 12755 1727204115.51325: in run() - task 12b410aa-8751-72e9-1a19-0000000000c0 12755 1727204115.51330: variable 'ansible_search_path' from source: unknown 12755 1727204115.51369: calling self._execute() 12755 1727204115.51497: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204115.51504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204115.51516: variable 'omit' from source: magic vars 12755 1727204115.51852: variable 'ansible_distribution_major_version' from source: facts 12755 1727204115.51865: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204115.51972: variable 'network_provider' from source: set_fact 12755 1727204115.51978: Evaluated conditional (network_provider == "nm"): True 12755 1727204115.51986: variable 'omit' from source: magic vars 12755 1727204115.52008: variable 'omit' from source: magic vars 12755 1727204115.52091: variable 'port1_profile' from source: play vars 12755 1727204115.52109: variable 'omit' from source: magic vars 12755 1727204115.52151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204115.52182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204115.52207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204115.52226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204115.52237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204115.52267: variable 'inventory_hostname' from source: host vars 
for 'managed-node1' 12755 1727204115.52271: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204115.52273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204115.52367: Set connection var ansible_connection to ssh 12755 1727204115.52373: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204115.52377: Set connection var ansible_shell_type to sh 12755 1727204115.52391: Set connection var ansible_timeout to 10 12755 1727204115.52397: Set connection var ansible_shell_executable to /bin/sh 12755 1727204115.52403: Set connection var ansible_pipelining to False 12755 1727204115.52427: variable 'ansible_shell_executable' from source: unknown 12755 1727204115.52430: variable 'ansible_connection' from source: unknown 12755 1727204115.52433: variable 'ansible_module_compression' from source: unknown 12755 1727204115.52437: variable 'ansible_shell_type' from source: unknown 12755 1727204115.52440: variable 'ansible_shell_executable' from source: unknown 12755 1727204115.52444: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204115.52449: variable 'ansible_pipelining' from source: unknown 12755 1727204115.52453: variable 'ansible_timeout' from source: unknown 12755 1727204115.52458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204115.52583: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204115.52597: variable 'omit' from source: magic vars 12755 1727204115.52603: starting attempt loop 12755 1727204115.52606: running the handler 12755 1727204115.52622: _low_level_execute_command(): starting 12755 1727204115.52631: _low_level_execute_command(): executing: 
/bin/sh -c 'echo ~ && sleep 0' 12755 1727204115.53158: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204115.53197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.53201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204115.53203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204115.53206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.53260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204115.53267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.53321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.55103: stdout chunk (state=3): >>>/root <<< 12755 1727204115.55210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.55263: stderr chunk (state=3): >>><<< 12755 1727204115.55266: stdout chunk (state=3): >>><<< 12755 1727204115.55294: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204115.55307: _low_level_execute_command(): starting 12755 1727204115.55313: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405 `" && echo ansible-tmp-1727204115.5529072-15208-141185850305405="` echo /root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405 `" ) && sleep 0' 12755 1727204115.55785: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204115.55790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204115.55801: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204115.55805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204115.55807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.55856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204115.55861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.55909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.58038: stdout chunk (state=3): >>>ansible-tmp-1727204115.5529072-15208-141185850305405=/root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405 <<< 12755 1727204115.58154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.58208: stderr chunk (state=3): >>><<< 12755 1727204115.58212: stdout chunk (state=3): >>><<< 12755 1727204115.58233: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204115.5529072-15208-141185850305405=/root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204115.58262: variable 'ansible_module_compression' from source: unknown 12755 1727204115.58313: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204115.58351: variable 'ansible_facts' from source: unknown 12755 1727204115.58414: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/AnsiballZ_command.py 12755 1727204115.58552: Sending initial data 12755 1727204115.58555: Sent initial data (156 bytes) 12755 1727204115.59036: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204115.59040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.59042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 
is address debug1: re-parsing configuration <<< 12755 1727204115.59045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204115.59049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.59105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204115.59108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.59161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.60922: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204115.60983: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204115.61034: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmp5yyor1cv /root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/AnsiballZ_command.py <<< 12755 1727204115.61038: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/AnsiballZ_command.py" <<< 12755 1727204115.61088: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmp5yyor1cv" to remote "/root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/AnsiballZ_command.py" <<< 12755 1727204115.62268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.62329: stderr chunk (state=3): >>><<< 12755 1727204115.62333: stdout chunk (state=3): >>><<< 12755 1727204115.62351: done transferring module to remote 12755 1727204115.62363: _low_level_execute_command(): starting 12755 1727204115.62368: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/ /root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/AnsiballZ_command.py && sleep 0' 12755 1727204115.62816: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204115.62819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.62822: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204115.62824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.62883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204115.62886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.62938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.65014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.65020: stdout chunk (state=3): >>><<< 12755 1727204115.65023: stderr chunk (state=3): >>><<< 12755 1727204115.65123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204115.65127: _low_level_execute_command(): starting 12755 1727204115.65129: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/AnsiballZ_command.py && sleep 0' 12755 1727204115.65776: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204115.65792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204115.65808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.65902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.85952: stdout chunk (state=3): >>> {"changed": true, "stdout": 
"", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-24 14:55:15.839449", "end": "2024-09-24 14:55:15.858556", "delta": "0:00:00.019107", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204115.87866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204115.87887: stdout chunk (state=3): >>><<< 12755 1727204115.87905: stderr chunk (state=3): >>><<< 12755 1727204115.87936: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-24 14:55:15.839449", "end": "2024-09-24 14:55:15.858556", "delta": "0:00:00.019107", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204115.87994: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204115.88013: _low_level_execute_command(): starting 12755 1727204115.88028: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204115.5529072-15208-141185850305405/ > /dev/null 2>&1 && sleep 0' 12755 1727204115.88675: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204115.88696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204115.88713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204115.88810: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204115.88859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204115.88884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204115.88909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.88997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204115.90978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204115.91030: stderr chunk (state=3): >>><<< 12755 1727204115.91033: stdout chunk (state=3): >>><<< 12755 1727204115.91050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204115.91057: handler run complete 12755 1727204115.91080: Evaluated conditional (False): False 12755 1727204115.91093: attempt loop complete, returning result 12755 1727204115.91096: _execute() done 12755 1727204115.91098: dumping result to json 12755 1727204115.91109: done dumping result, returning 12755 1727204115.91120: done running TaskExecutor() for managed-node1/TASK: From the active connection, get the port1 profile "bond0.0" [12b410aa-8751-72e9-1a19-0000000000c0] 12755 1727204115.91123: sending task result for task 12b410aa-8751-72e9-1a19-0000000000c0 12755 1727204115.91235: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000c0 12755 1727204115.91238: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.0" ], "delta": "0:00:00.019107", "end": "2024-09-24 14:55:15.858556", "rc": 0, "start": "2024-09-24 14:55:15.839449" } 12755 1727204115.91341: no more pending results, returning what we have 12755 1727204115.91345: results queue empty 12755 1727204115.91346: checking for any_errors_fatal 12755 1727204115.91351: done checking for any_errors_fatal 12755 1727204115.91352: checking for max_fail_percentage 12755 1727204115.91353: done checking for max_fail_percentage 12755 1727204115.91354: checking to see if all hosts have failed and the running result is not ok 12755 1727204115.91355: done checking to see if all 
hosts have failed 12755 1727204115.91356: getting the remaining hosts for this loop 12755 1727204115.91363: done getting the remaining hosts for this loop 12755 1727204115.91367: getting the next task for host managed-node1 12755 1727204115.91377: done getting next task for host managed-node1 12755 1727204115.91380: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 12755 1727204115.91382: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204115.91386: getting variables 12755 1727204115.91388: in VariableManager get_vars() 12755 1727204115.91472: Calling all_inventory to load vars for managed-node1 12755 1727204115.91476: Calling groups_inventory to load vars for managed-node1 12755 1727204115.91478: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204115.91491: Calling all_plugins_play to load vars for managed-node1 12755 1727204115.91494: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204115.91498: Calling groups_plugins_play to load vars for managed-node1 12755 1727204115.93400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204115.95185: done with get_vars() 12755 1727204115.95221: done getting variables 12755 1727204115.95292: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204115.95426: variable 'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile 
"bond0.1"] ************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Tuesday 24 September 2024 14:55:15 -0400 (0:00:00.450) 0:00:41.190 ***** 12755 1727204115.95459: entering _queue_task() for managed-node1/command 12755 1727204115.95804: worker is 1 (out of 1 available) 12755 1727204115.95819: exiting _queue_task() for managed-node1/command 12755 1727204115.95833: done queuing things up, now waiting for results queue to drain 12755 1727204115.95835: waiting for pending results... 12755 1727204115.96312: running TaskExecutor() for managed-node1/TASK: From the active connection, get the port2 profile "bond0.1" 12755 1727204115.96317: in run() - task 12b410aa-8751-72e9-1a19-0000000000c1 12755 1727204115.96321: variable 'ansible_search_path' from source: unknown 12755 1727204115.96342: calling self._execute() 12755 1727204115.96477: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204115.96495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204115.96517: variable 'omit' from source: magic vars 12755 1727204115.96978: variable 'ansible_distribution_major_version' from source: facts 12755 1727204115.97002: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204115.97170: variable 'network_provider' from source: set_fact 12755 1727204115.97174: Evaluated conditional (network_provider == "nm"): True 12755 1727204115.97178: variable 'omit' from source: magic vars 12755 1727204115.97208: variable 'omit' from source: magic vars 12755 1727204115.97389: variable 'port2_profile' from source: play vars 12755 1727204115.97393: variable 'omit' from source: magic vars 12755 1727204115.97409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204115.97455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 12755 1727204115.97483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204115.97513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204115.97535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204115.97578: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204115.97588: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204115.97599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204115.97736: Set connection var ansible_connection to ssh 12755 1727204115.97826: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204115.97830: Set connection var ansible_shell_type to sh 12755 1727204115.97832: Set connection var ansible_timeout to 10 12755 1727204115.97834: Set connection var ansible_shell_executable to /bin/sh 12755 1727204115.97837: Set connection var ansible_pipelining to False 12755 1727204115.97839: variable 'ansible_shell_executable' from source: unknown 12755 1727204115.97842: variable 'ansible_connection' from source: unknown 12755 1727204115.97844: variable 'ansible_module_compression' from source: unknown 12755 1727204115.97846: variable 'ansible_shell_type' from source: unknown 12755 1727204115.97848: variable 'ansible_shell_executable' from source: unknown 12755 1727204115.97853: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204115.97863: variable 'ansible_pipelining' from source: unknown 12755 1727204115.97870: variable 'ansible_timeout' from source: unknown 12755 1727204115.97882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204115.98061: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204115.98082: variable 'omit' from source: magic vars 12755 1727204115.98095: starting attempt loop 12755 1727204115.98102: running the handler 12755 1727204115.98124: _low_level_execute_command(): starting 12755 1727204115.98137: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204115.98901: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204115.98922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204115.98939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204115.99057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204115.99080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204115.99168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 12755 1727204116.01011: stdout chunk (state=3): >>>/root <<< 12755 1727204116.01223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204116.01226: stdout chunk (state=3): >>><<< 12755 1727204116.01229: stderr chunk (state=3): >>><<< 12755 1727204116.01252: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204116.01274: _low_level_execute_command(): starting 12755 1727204116.01286: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472 `" && echo ansible-tmp-1727204116.012595-15225-20290852448472="` echo /root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472 `" ) && sleep 0' 12755 1727204116.01964: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204116.01980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204116.02007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204116.02076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204116.02146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204116.02174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204116.02218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204116.02280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204116.04484: stdout chunk (state=3): >>>ansible-tmp-1727204116.012595-15225-20290852448472=/root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472 <<< 12755 1727204116.04698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204116.04702: stdout chunk (state=3): >>><<< 12755 1727204116.04704: stderr chunk (state=3): >>><<< 12755 1727204116.04895: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204116.012595-15225-20290852448472=/root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204116.04899: variable 'ansible_module_compression' from source: unknown 12755 1727204116.04902: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204116.04904: variable 'ansible_facts' from source: unknown 12755 1727204116.04983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/AnsiballZ_command.py 12755 1727204116.05152: Sending initial data 12755 1727204116.05254: Sent initial data (154 bytes) 12755 1727204116.05859: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204116.05876: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 12755 1727204116.05905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204116.06024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204116.06059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204116.06142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204116.07913: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204116.07971: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12755 1727204116.08039: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpm896hw_a /root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/AnsiballZ_command.py <<< 12755 1727204116.08055: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/AnsiballZ_command.py" <<< 12755 1727204116.08095: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpm896hw_a" to remote "/root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/AnsiballZ_command.py" <<< 12755 1727204116.09239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204116.09348: stderr chunk (state=3): >>><<< 12755 1727204116.09362: stdout chunk (state=3): >>><<< 12755 1727204116.09399: done transferring module to remote 12755 1727204116.09416: _low_level_execute_command(): starting 12755 1727204116.09427: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/ /root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/AnsiballZ_command.py && sleep 0' 12755 1727204116.10012: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204116.10037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204116.10041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204116.10092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204116.10110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204116.10154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204116.12266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204116.12285: stderr chunk (state=3): >>><<< 12755 1727204116.12307: stdout chunk (state=3): >>><<< 12755 1727204116.12324: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204116.12331: _low_level_execute_command(): starting 12755 1727204116.12339: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/AnsiballZ_command.py && sleep 0' 12755 1727204116.13028: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204116.13040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204116.13091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204116.13095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204116.13148: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204116.33911: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-24 14:55:16.318346", "end": "2024-09-24 14:55:16.338257", "delta": "0:00:00.019911", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204116.35728: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204116.35781: stderr chunk (state=3): >>><<< 12755 1727204116.35785: stdout chunk (state=3): >>><<< 12755 1727204116.35805: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-24 14:55:16.318346", "end": "2024-09-24 14:55:16.338257", "delta": "0:00:00.019911", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204116.35846: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204116.35857: _low_level_execute_command(): starting 12755 1727204116.35863: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204116.012595-15225-20290852448472/ > /dev/null 2>&1 && sleep 0' 12755 1727204116.36334: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204116.36338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204116.36341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204116.36343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204116.36399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204116.36408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204116.36471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204116.38552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204116.38556: stdout chunk (state=3): >>><<< 12755 1727204116.38799: stderr chunk (state=3): >>><<< 12755 1727204116.38802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204116.38809: handler run complete 12755 1727204116.38811: Evaluated conditional (False): False 12755 1727204116.38813: attempt loop complete, returning result 12755 1727204116.38815: _execute() done 12755 1727204116.38816: dumping result to json 12755 1727204116.38820: done dumping result, returning 12755 1727204116.38822: done running TaskExecutor() for managed-node1/TASK: From the active connection, get the port2 profile "bond0.1" [12b410aa-8751-72e9-1a19-0000000000c1] 12755 1727204116.38824: sending task result for task 12b410aa-8751-72e9-1a19-0000000000c1 12755 1727204116.38897: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000c1 12755 1727204116.38901: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.1" ], "delta": "0:00:00.019911", "end": "2024-09-24 14:55:16.338257", "rc": 0, "start": "2024-09-24 14:55:16.318346" } 12755 1727204116.39109: no more pending results, returning what we have 12755 1727204116.39113: results queue empty 12755 1727204116.39114: checking for any_errors_fatal 12755 1727204116.39124: done checking for any_errors_fatal 12755 1727204116.39125: checking for max_fail_percentage 12755 1727204116.39127: done checking for max_fail_percentage 12755 1727204116.39128: checking to see if all hosts have failed and the 
running result is not ok 12755 1727204116.39130: done checking to see if all hosts have failed 12755 1727204116.39131: getting the remaining hosts for this loop 12755 1727204116.39132: done getting the remaining hosts for this loop 12755 1727204116.39136: getting the next task for host managed-node1 12755 1727204116.39142: done getting next task for host managed-node1 12755 1727204116.39145: ^ task is: TASK: Assert that the port1 profile is not activated 12755 1727204116.39147: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204116.39152: getting variables 12755 1727204116.39154: in VariableManager get_vars() 12755 1727204116.39299: Calling all_inventory to load vars for managed-node1 12755 1727204116.39303: Calling groups_inventory to load vars for managed-node1 12755 1727204116.39306: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204116.39318: Calling all_plugins_play to load vars for managed-node1 12755 1727204116.39321: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204116.39325: Calling groups_plugins_play to load vars for managed-node1 12755 1727204116.41558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204116.44712: done with get_vars() 12755 1727204116.44754: done getting variables 12755 1727204116.44833: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port1 profile is not activated] ************************** 
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.494) 0:00:41.684 ***** 12755 1727204116.44868: entering _queue_task() for managed-node1/assert 12755 1727204116.45328: worker is 1 (out of 1 available) 12755 1727204116.45342: exiting _queue_task() for managed-node1/assert 12755 1727204116.45354: done queuing things up, now waiting for results queue to drain 12755 1727204116.45356: waiting for pending results... 12755 1727204116.45611: running TaskExecutor() for managed-node1/TASK: Assert that the port1 profile is not activated 12755 1727204116.45729: in run() - task 12b410aa-8751-72e9-1a19-0000000000c2 12755 1727204116.45745: variable 'ansible_search_path' from source: unknown 12755 1727204116.45792: calling self._execute() 12755 1727204116.45929: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204116.45937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204116.45950: variable 'omit' from source: magic vars 12755 1727204116.46436: variable 'ansible_distribution_major_version' from source: facts 12755 1727204116.46450: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204116.46610: variable 'network_provider' from source: set_fact 12755 1727204116.46618: Evaluated conditional (network_provider == "nm"): True 12755 1727204116.46629: variable 'omit' from source: magic vars 12755 1727204116.46663: variable 'omit' from source: magic vars 12755 1727204116.46794: variable 'port1_profile' from source: play vars 12755 1727204116.46894: variable 'omit' from source: magic vars 12755 1727204116.46898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204116.46915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 
1727204116.46943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204116.46972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204116.46990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204116.47031: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204116.47035: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204116.47038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204116.47172: Set connection var ansible_connection to ssh 12755 1727204116.47187: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204116.47192: Set connection var ansible_shell_type to sh 12755 1727204116.47209: Set connection var ansible_timeout to 10 12755 1727204116.47217: Set connection var ansible_shell_executable to /bin/sh 12755 1727204116.47398: Set connection var ansible_pipelining to False 12755 1727204116.47402: variable 'ansible_shell_executable' from source: unknown 12755 1727204116.47405: variable 'ansible_connection' from source: unknown 12755 1727204116.47408: variable 'ansible_module_compression' from source: unknown 12755 1727204116.47410: variable 'ansible_shell_type' from source: unknown 12755 1727204116.47412: variable 'ansible_shell_executable' from source: unknown 12755 1727204116.47415: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204116.47417: variable 'ansible_pipelining' from source: unknown 12755 1727204116.47419: variable 'ansible_timeout' from source: unknown 12755 1727204116.47421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204116.47468: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204116.47480: variable 'omit' from source: magic vars 12755 1727204116.47486: starting attempt loop 12755 1727204116.47491: running the handler 12755 1727204116.47708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204116.50947: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204116.51025: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204116.51078: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204116.51127: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204116.51158: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204116.51249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204116.51283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204116.51323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204116.51376: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204116.51393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204116.51522: variable 'active_port1_profile' from source: set_fact 12755 1727204116.51557: Evaluated conditional (active_port1_profile.stdout | length == 0): True 12755 1727204116.51567: handler run complete 12755 1727204116.51741: attempt loop complete, returning result 12755 1727204116.51744: _execute() done 12755 1727204116.51747: dumping result to json 12755 1727204116.51749: done dumping result, returning 12755 1727204116.51751: done running TaskExecutor() for managed-node1/TASK: Assert that the port1 profile is not activated [12b410aa-8751-72e9-1a19-0000000000c2] 12755 1727204116.51755: sending task result for task 12b410aa-8751-72e9-1a19-0000000000c2 12755 1727204116.51835: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000c2 12755 1727204116.51839: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204116.51905: no more pending results, returning what we have 12755 1727204116.51909: results queue empty 12755 1727204116.51910: checking for any_errors_fatal 12755 1727204116.51922: done checking for any_errors_fatal 12755 1727204116.51923: checking for max_fail_percentage 12755 1727204116.51925: done checking for max_fail_percentage 12755 1727204116.51926: checking to see if all hosts have failed and the running result is not ok 12755 1727204116.51928: done checking to see if all hosts have failed 12755 1727204116.51929: getting the remaining hosts for this loop 12755 1727204116.51931: done getting the remaining hosts for this loop 12755 1727204116.51936: getting the next task for host 
managed-node1 12755 1727204116.51944: done getting next task for host managed-node1 12755 1727204116.51948: ^ task is: TASK: Assert that the port2 profile is not activated 12755 1727204116.51951: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204116.51955: getting variables 12755 1727204116.51965: in VariableManager get_vars() 12755 1727204116.52037: Calling all_inventory to load vars for managed-node1 12755 1727204116.52041: Calling groups_inventory to load vars for managed-node1 12755 1727204116.52044: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204116.52059: Calling all_plugins_play to load vars for managed-node1 12755 1727204116.52063: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204116.52068: Calling groups_plugins_play to load vars for managed-node1 12755 1727204116.55891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204116.60966: done with get_vars() 12755 1727204116.60999: done getting variables 12755 1727204116.61053: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port2 profile is not activated] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125 Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.162) 0:00:41.846 ***** 12755 1727204116.61077: entering _queue_task() for managed-node1/assert 12755 
1727204116.61345: worker is 1 (out of 1 available)
12755 1727204116.61359: exiting _queue_task() for managed-node1/assert
12755 1727204116.61371: done queuing things up, now waiting for results queue to drain
12755 1727204116.61373: waiting for pending results...
12755 1727204116.61574: running TaskExecutor() for managed-node1/TASK: Assert that the port2 profile is not activated
12755 1727204116.61649: in run() - task 12b410aa-8751-72e9-1a19-0000000000c3
12755 1727204116.61672: variable 'ansible_search_path' from source: unknown
12755 1727204116.61708: calling self._execute()
12755 1727204116.61803: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204116.61813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204116.61830: variable 'omit' from source: magic vars
12755 1727204116.62161: variable 'ansible_distribution_major_version' from source: facts
12755 1727204116.62173: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204116.62273: variable 'network_provider' from source: set_fact
12755 1727204116.62279: Evaluated conditional (network_provider == "nm"): True
12755 1727204116.62286: variable 'omit' from source: magic vars
12755 1727204116.62310: variable 'omit' from source: magic vars
12755 1727204116.62411: variable 'port2_profile' from source: play vars
12755 1727204116.62435: variable 'omit' from source: magic vars
12755 1727204116.62520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204116.62582: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204116.62585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204116.62598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204116.62614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204116.62655: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12755 1727204116.62658: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204116.62661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204116.62829: Set connection var ansible_connection to ssh
12755 1727204116.62832: Set connection var ansible_module_compression to ZIP_DEFLATED
12755 1727204116.62835: Set connection var ansible_shell_type to sh
12755 1727204116.62837: Set connection var ansible_timeout to 10
12755 1727204116.62839: Set connection var ansible_shell_executable to /bin/sh
12755 1727204116.62842: Set connection var ansible_pipelining to False
12755 1727204116.62849: variable 'ansible_shell_executable' from source: unknown
12755 1727204116.62855: variable 'ansible_connection' from source: unknown
12755 1727204116.62857: variable 'ansible_module_compression' from source: unknown
12755 1727204116.62863: variable 'ansible_shell_type' from source: unknown
12755 1727204116.62866: variable 'ansible_shell_executable' from source: unknown
12755 1727204116.62871: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204116.62877: variable 'ansible_pipelining' from source: unknown
12755 1727204116.62880: variable 'ansible_timeout' from source: unknown
12755 1727204116.62894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204116.63159: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12755 1727204116.63163: variable 'omit' from source: magic vars
12755 1727204116.63166: starting attempt loop
12755 1727204116.63170: running the handler
12755 1727204116.63385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204116.65364: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204116.65431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204116.65461: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204116.65497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204116.65520: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204116.65580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204116.65607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204116.65635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204116.65666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204116.65680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204116.65773: variable 'active_port2_profile' from source: set_fact
12755 1727204116.65793: Evaluated conditional (active_port2_profile.stdout | length == 0): True
12755 1727204116.65815: handler run complete
12755 1727204116.65839: attempt loop complete, returning result
12755 1727204116.65843: _execute() done
12755 1727204116.65846: dumping result to json
12755 1727204116.65848: done dumping result, returning
12755 1727204116.65851: done running TaskExecutor() for managed-node1/TASK: Assert that the port2 profile is not activated [12b410aa-8751-72e9-1a19-0000000000c3]
12755 1727204116.65853: sending task result for task 12b410aa-8751-72e9-1a19-0000000000c3
12755 1727204116.65955: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000c3
12755 1727204116.65959: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
12755 1727204116.66047: no more pending results, returning what we have
12755 1727204116.66050: results queue empty
12755 1727204116.66052: checking for any_errors_fatal
12755 1727204116.66058: done checking for any_errors_fatal
12755 1727204116.66059: checking for max_fail_percentage
12755 1727204116.66061: done checking for max_fail_percentage
12755 1727204116.66062: checking to see if all hosts have failed and the running result is not ok
12755 1727204116.66063: done checking to see if all hosts have failed
12755 1727204116.66064: getting the remaining hosts for this loop
12755 1727204116.66066: done getting the remaining hosts for this loop
12755 1727204116.66071: getting the next task for host managed-node1
12755 1727204116.66077: done getting next task for host managed-node1
12755 1727204116.66080: ^ task is: TASK: Get the port1 device state
12755 1727204116.66082: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204116.66086: getting variables
12755 1727204116.66096: in VariableManager get_vars()
12755 1727204116.66154: Calling all_inventory to load vars for managed-node1
12755 1727204116.66157: Calling groups_inventory to load vars for managed-node1
12755 1727204116.66160: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204116.66172: Calling all_plugins_play to load vars for managed-node1
12755 1727204116.66175: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204116.66178: Calling groups_plugins_play to load vars for managed-node1
12755 1727204116.68766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204116.71920: done with get_vars()
12755 1727204116.71969: done getting variables
12755 1727204116.72050: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get the port1 device state] **********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132
Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.110) 0:00:41.956 *****
12755 1727204116.72086: entering _queue_task() for managed-node1/command
12755 1727204116.72600: worker is 1 (out of 1 available)
12755 1727204116.72612: exiting _queue_task() for managed-node1/command
12755 1727204116.72624: done queuing things up, now waiting for results queue to drain
12755 1727204116.72626: waiting for pending results...
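The assert that just passed hinges on `Evaluated conditional (active_port2_profile.stdout | length == 0): True`. In Jinja2, `| length` on a string is its character count, so the assertion holds exactly when the registered command printed nothing. A minimal sketch of that predicate (an illustration, not the playbook's actual source):

```python
# Sketch of the check behind the trace line
#   "Evaluated conditional (active_port2_profile.stdout | length == 0): True".
# The function name is hypothetical; it mirrors the Jinja2 test on the
# registered command's stdout.

def profile_not_activated(stdout: str) -> bool:
    """True when the captured stdout is empty, i.e. no active profile matched."""
    return len(stdout) == 0

if __name__ == "__main__":
    # The trace above is the empty-stdout case, so the assertion passed.
    print(profile_not_activated(""))         # True
    print(profile_not_activated("bond0.1"))  # False
```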
12755 1727204116.72933: running TaskExecutor() for managed-node1/TASK: Get the port1 device state
12755 1727204116.73008: in run() - task 12b410aa-8751-72e9-1a19-0000000000c4
12755 1727204116.73028: variable 'ansible_search_path' from source: unknown
12755 1727204116.73116: calling self._execute()
12755 1727204116.73200: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204116.73219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204116.73254: variable 'omit' from source: magic vars
12755 1727204116.73774: variable 'ansible_distribution_major_version' from source: facts
12755 1727204116.73788: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204116.73953: variable 'network_provider' from source: set_fact
12755 1727204116.73992: Evaluated conditional (network_provider == "initscripts"): False
12755 1727204116.74001: when evaluation is False, skipping this task
12755 1727204116.74004: _execute() done
12755 1727204116.74011: dumping result to json
12755 1727204116.74014: done dumping result, returning
12755 1727204116.74113: done running TaskExecutor() for managed-node1/TASK: Get the port1 device state [12b410aa-8751-72e9-1a19-0000000000c4]
12755 1727204116.74118: sending task result for task 12b410aa-8751-72e9-1a19-0000000000c4
12755 1727204116.74194: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000c4
12755 1727204116.74198: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
12755 1727204116.74264: no more pending results, returning what we have
12755 1727204116.74268: results queue empty
12755 1727204116.74270: checking for any_errors_fatal
12755 1727204116.74280: done checking for any_errors_fatal
12755 1727204116.74281: checking for max_fail_percentage
12755 1727204116.74283: done checking for max_fail_percentage
12755 1727204116.74284: checking to see if all hosts have failed and the running result is not ok
12755 1727204116.74286: done checking to see if all hosts have failed
12755 1727204116.74287: getting the remaining hosts for this loop
12755 1727204116.74291: done getting the remaining hosts for this loop
12755 1727204116.74296: getting the next task for host managed-node1
12755 1727204116.74304: done getting next task for host managed-node1
12755 1727204116.74309: ^ task is: TASK: Get the port2 device state
12755 1727204116.74313: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204116.74319: getting variables
12755 1727204116.74321: in VariableManager get_vars()
12755 1727204116.74609: Calling all_inventory to load vars for managed-node1
12755 1727204116.74615: Calling groups_inventory to load vars for managed-node1
12755 1727204116.74619: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204116.74632: Calling all_plugins_play to load vars for managed-node1
12755 1727204116.74636: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204116.74640: Calling groups_plugins_play to load vars for managed-node1
12755 1727204116.76957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204116.80245: done with get_vars()
12755 1727204116.80307: done getting variables
12755 1727204116.80386: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get the port2 device state] **********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139
Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.083) 0:00:42.039 *****
12755 1727204116.80425: entering _queue_task() for managed-node1/command
12755 1727204116.80824: worker is 1 (out of 1 available)
12755 1727204116.80842: exiting _queue_task() for managed-node1/command
12755 1727204116.80856: done queuing things up, now waiting for results queue to drain
12755 1727204116.80858: waiting for pending results...
12755 1727204116.81411: running TaskExecutor() for managed-node1/TASK: Get the port2 device state
12755 1727204116.81417: in run() - task 12b410aa-8751-72e9-1a19-0000000000c5
12755 1727204116.81423: variable 'ansible_search_path' from source: unknown
12755 1727204116.81426: calling self._execute()
12755 1727204116.81506: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204116.81513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204116.81528: variable 'omit' from source: magic vars
12755 1727204116.81995: variable 'ansible_distribution_major_version' from source: facts
12755 1727204116.82009: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204116.82168: variable 'network_provider' from source: set_fact
12755 1727204116.82175: Evaluated conditional (network_provider == "initscripts"): False
12755 1727204116.82178: when evaluation is False, skipping this task
12755 1727204116.82184: _execute() done
12755 1727204116.82190: dumping result to json
12755 1727204116.82195: done dumping result, returning
12755 1727204116.82203: done running TaskExecutor() for managed-node1/TASK: Get the port2 device state [12b410aa-8751-72e9-1a19-0000000000c5]
12755 1727204116.82209: sending task result for task 12b410aa-8751-72e9-1a19-0000000000c5
12755 1727204116.82322: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000c5
12755 1727204116.82325: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
12755 1727204116.82413: no more pending results, returning what we have
12755 1727204116.82418: results queue empty
12755 1727204116.82419: checking for any_errors_fatal
12755 1727204116.82428: done checking for any_errors_fatal
12755 1727204116.82429: checking for max_fail_percentage
12755 1727204116.82431: done checking for max_fail_percentage
12755 1727204116.82432: checking to see if all hosts have failed and the running result is not ok
12755 1727204116.82433: done checking to see if all hosts have failed
12755 1727204116.82434: getting the remaining hosts for this loop
12755 1727204116.82436: done getting the remaining hosts for this loop
12755 1727204116.82441: getting the next task for host managed-node1
12755 1727204116.82448: done getting next task for host managed-node1
12755 1727204116.82451: ^ task is: TASK: Assert that the port1 device is in DOWN state
12755 1727204116.82456: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204116.82461: getting variables
12755 1727204116.82463: in VariableManager get_vars()
12755 1727204116.82532: Calling all_inventory to load vars for managed-node1
12755 1727204116.82536: Calling groups_inventory to load vars for managed-node1
12755 1727204116.82539: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204116.82558: Calling all_plugins_play to load vars for managed-node1
12755 1727204116.82562: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204116.82567: Calling groups_plugins_play to load vars for managed-node1
12755 1727204116.85395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204116.88509: done with get_vars()
12755 1727204116.88547: done getting variables
12755 1727204116.88631: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Assert that the port1 device is in DOWN state] ***************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146
Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.082) 0:00:42.122 *****
12755 1727204116.88666: entering _queue_task() for managed-node1/assert
12755 1727204116.89167: worker is 1 (out of 1 available)
12755 1727204116.89181: exiting _queue_task() for managed-node1/assert
12755 1727204116.89196: done queuing things up, now waiting for results queue to drain
12755 1727204116.89198: waiting for pending results...
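The "Get the port1/port2 device state" tasks above are skipped with `false_condition: network_provider == "initscripts"`: they run only under the initscripts provider, while this run uses `nm`, and everything sits under the play-wide `ansible_distribution_major_version != '6'` guard. A rough sketch of that gating (an illustration only; the real evaluation happens in Jinja2 inside Ansible):

```python
# Hedged sketch of the `when` gating visible in the trace. Function and
# parameter names are hypothetical; the two conditionals are taken verbatim
# from the log: `ansible_distribution_major_version != '6'` and
# `network_provider == "initscripts"`.

def device_state_task_runs(distribution_major_version: str, network_provider: str) -> bool:
    if distribution_major_version == "6":
        return False  # outer guard evaluates False, task is skipped
    # Inner guard: these tasks only apply to the initscripts provider.
    return network_provider == "initscripts"

if __name__ == "__main__":
    print(device_state_task_runs("40", "nm"))           # False -> "skipping this task"
    print(device_state_task_runs("40", "initscripts"))  # True
```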
12755 1727204116.89544: running TaskExecutor() for managed-node1/TASK: Assert that the port1 device is in DOWN state
12755 1727204116.89596: in run() - task 12b410aa-8751-72e9-1a19-0000000000c6
12755 1727204116.89697: variable 'ansible_search_path' from source: unknown
12755 1727204116.89702: calling self._execute()
12755 1727204116.89799: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204116.89810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204116.89824: variable 'omit' from source: magic vars
12755 1727204116.90377: variable 'ansible_distribution_major_version' from source: facts
12755 1727204116.90393: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204116.90538: variable 'network_provider' from source: set_fact
12755 1727204116.90544: Evaluated conditional (network_provider == "initscripts"): False
12755 1727204116.90552: when evaluation is False, skipping this task
12755 1727204116.90560: _execute() done
12755 1727204116.90563: dumping result to json
12755 1727204116.90567: done dumping result, returning
12755 1727204116.90570: done running TaskExecutor() for managed-node1/TASK: Assert that the port1 device is in DOWN state [12b410aa-8751-72e9-1a19-0000000000c6]
12755 1727204116.90577: sending task result for task 12b410aa-8751-72e9-1a19-0000000000c6
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
12755 1727204116.90743: no more pending results, returning what we have
12755 1727204116.90747: results queue empty
12755 1727204116.90748: checking for any_errors_fatal
12755 1727204116.90759: done checking for any_errors_fatal
12755 1727204116.90759: checking for max_fail_percentage
12755 1727204116.90761: done checking for max_fail_percentage
12755 1727204116.90762: checking to see if all hosts have failed and the running result is not ok
12755 1727204116.90763: done checking to see if all hosts have failed
12755 1727204116.90764: getting the remaining hosts for this loop
12755 1727204116.90766: done getting the remaining hosts for this loop
12755 1727204116.90770: getting the next task for host managed-node1
12755 1727204116.90779: done getting next task for host managed-node1
12755 1727204116.90782: ^ task is: TASK: Assert that the port2 device is in DOWN state
12755 1727204116.90785: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204116.90790: getting variables
12755 1727204116.90792: in VariableManager get_vars()
12755 1727204116.90851: Calling all_inventory to load vars for managed-node1
12755 1727204116.90854: Calling groups_inventory to load vars for managed-node1
12755 1727204116.90857: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204116.90870: Calling all_plugins_play to load vars for managed-node1
12755 1727204116.90874: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204116.90877: Calling groups_plugins_play to load vars for managed-node1
12755 1727204116.91406: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000c6
12755 1727204116.91410: WORKER PROCESS EXITING
12755 1727204116.92336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204116.94888: done with get_vars()
12755 1727204116.94918: done getting variables
12755 1727204116.94968: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Assert that the port2 device is in DOWN state] ***************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153
Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.063) 0:00:42.185 *****
12755 1727204116.94995: entering _queue_task() for managed-node1/assert
12755 1727204116.95253: worker is 1 (out of 1 available)
12755 1727204116.95268: exiting _queue_task() for managed-node1/assert
12755 1727204116.95283: done queuing things up, now waiting for results queue to drain
12755 1727204116.95284: waiting for pending results...
12755 1727204116.95482: running TaskExecutor() for managed-node1/TASK: Assert that the port2 device is in DOWN state
12755 1727204116.95571: in run() - task 12b410aa-8751-72e9-1a19-0000000000c7
12755 1727204116.95584: variable 'ansible_search_path' from source: unknown
12755 1727204116.95624: calling self._execute()
12755 1727204116.95716: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204116.95724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204116.95733: variable 'omit' from source: magic vars
12755 1727204116.96064: variable 'ansible_distribution_major_version' from source: facts
12755 1727204116.96077: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204116.96172: variable 'network_provider' from source: set_fact
12755 1727204116.96178: Evaluated conditional (network_provider == "initscripts"): False
12755 1727204116.96181: when evaluation is False, skipping this task
12755 1727204116.96193: _execute() done
12755 1727204116.96198: dumping result to json
12755 1727204116.96201: done dumping result, returning
12755 1727204116.96205: done running TaskExecutor() for managed-node1/TASK: Assert that the port2 device is in DOWN state [12b410aa-8751-72e9-1a19-0000000000c7]
12755 1727204116.96210: sending task result for task 12b410aa-8751-72e9-1a19-0000000000c7
12755 1727204116.96316: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000c7
12755 1727204116.96321: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
12755 1727204116.96528: no more pending results, returning what we have
12755 1727204116.96532: results queue empty
12755 1727204116.96534: checking for any_errors_fatal
12755 1727204116.96539: done checking for any_errors_fatal
12755 1727204116.96540: checking for max_fail_percentage
12755 1727204116.96542: done checking for max_fail_percentage
12755 1727204116.96543: checking to see if all hosts have failed and the running result is not ok
12755 1727204116.96544: done checking to see if all hosts have failed
12755 1727204116.96545: getting the remaining hosts for this loop
12755 1727204116.96547: done getting the remaining hosts for this loop
12755 1727204116.96550: getting the next task for host managed-node1
12755 1727204116.96559: done getting next task for host managed-node1
12755 1727204116.96563: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
12755 1727204116.96566: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204116.96587: getting variables
12755 1727204116.96591: in VariableManager get_vars()
12755 1727204116.96645: Calling all_inventory to load vars for managed-node1
12755 1727204116.96649: Calling groups_inventory to load vars for managed-node1
12755 1727204116.96652: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204116.96663: Calling all_plugins_play to load vars for managed-node1
12755 1727204116.96666: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204116.96670: Calling groups_plugins_play to load vars for managed-node1
12755 1727204116.98202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204116.99783: done with get_vars()
12755 1727204116.99811: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 14:55:16 -0400 (0:00:00.048) 0:00:42.234 *****
12755 1727204116.99896: entering _queue_task() for managed-node1/include_tasks
12755 1727204117.00151: worker is 1 (out of 1 available)
12755 1727204117.00168: exiting _queue_task() for managed-node1/include_tasks
12755 1727204117.00182: done queuing things up, now waiting for results queue to drain
12755 1727204117.00184: waiting for pending results...
12755 1727204117.00387: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
12755 1727204117.00515: in run() - task 12b410aa-8751-72e9-1a19-0000000000cf
12755 1727204117.00533: variable 'ansible_search_path' from source: unknown
12755 1727204117.00537: variable 'ansible_search_path' from source: unknown
12755 1727204117.00569: calling self._execute()
12755 1727204117.00667: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204117.00675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204117.00685: variable 'omit' from source: magic vars
12755 1727204117.01013: variable 'ansible_distribution_major_version' from source: facts
12755 1727204117.01027: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204117.01034: _execute() done
12755 1727204117.01037: dumping result to json
12755 1727204117.01042: done dumping result, returning
12755 1727204117.01051: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-72e9-1a19-0000000000cf]
12755 1727204117.01057: sending task result for task 12b410aa-8751-72e9-1a19-0000000000cf
12755 1727204117.01158: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000cf
12755 1727204117.01161: WORKER PROCESS EXITING
12755 1727204117.01223: no more pending results, returning what we have
12755 1727204117.01228: in VariableManager get_vars()
12755 1727204117.01288: Calling all_inventory to load vars for managed-node1
12755 1727204117.01293: Calling groups_inventory to load vars for managed-node1
12755 1727204117.01296: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204117.01309: Calling all_plugins_play to load vars for managed-node1
12755 1727204117.01312: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204117.01316: Calling groups_plugins_play to load vars for managed-node1
12755 1727204117.02663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204117.04257: done with get_vars()
12755 1727204117.04278: variable 'ansible_search_path' from source: unknown
12755 1727204117.04279: variable 'ansible_search_path' from source: unknown
12755 1727204117.04320: we have included files to process
12755 1727204117.04321: generating all_blocks data
12755 1727204117.04323: done generating all_blocks data
12755 1727204117.04328: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
12755 1727204117.04329: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
12755 1727204117.04331: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
12755 1727204117.04825: done processing included file
12755 1727204117.04827: iterating over new_blocks loaded from include file
12755 1727204117.04828: in VariableManager get_vars()
12755 1727204117.04859: done with get_vars()
12755 1727204117.04861: filtering new block on tags
12755 1727204117.04876: done filtering new block on tags
12755 1727204117.04878: in VariableManager get_vars()
12755 1727204117.04904: done with get_vars()
12755 1727204117.04905: filtering new block on tags
12755 1727204117.04923: done filtering new block on tags
12755 1727204117.04925: in VariableManager get_vars()
12755 1727204117.04949: done with get_vars()
12755 1727204117.04950: filtering new block on tags
12755 1727204117.04969: done filtering new block on tags
12755 1727204117.04971: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1
12755 1727204117.04975: extending task lists for all hosts with included blocks
12755 1727204117.05642: done extending task lists
12755 1727204117.05643: done processing included files
12755 1727204117.05644: results queue empty
12755 1727204117.05645: checking for any_errors_fatal
12755 1727204117.05647: done checking for any_errors_fatal
12755 1727204117.05648: checking for max_fail_percentage
12755 1727204117.05649: done checking for max_fail_percentage
12755 1727204117.05649: checking to see if all hosts have failed and the running result is not ok
12755 1727204117.05650: done checking to see if all hosts have failed
12755 1727204117.05651: getting the remaining hosts for this loop
12755 1727204117.05652: done getting the remaining hosts for this loop
12755 1727204117.05654: getting the next task for host managed-node1
12755 1727204117.05657: done getting next task for host managed-node1
12755 1727204117.05659: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
12755 1727204117.05662: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204117.05671: getting variables
12755 1727204117.05672: in VariableManager get_vars()
12755 1727204117.05688: Calling all_inventory to load vars for managed-node1
12755 1727204117.05692: Calling groups_inventory to load vars for managed-node1
12755 1727204117.05694: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204117.05699: Calling all_plugins_play to load vars for managed-node1
12755 1727204117.05701: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204117.05703: Calling groups_plugins_play to load vars for managed-node1
12755 1727204117.06866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204117.08442: done with get_vars()
12755 1727204117.08463: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.086) 0:00:42.321 *****
12755 1727204117.08530: entering _queue_task() for managed-node1/setup
12755 1727204117.08811: worker is 1 (out of 1 available)
12755 1727204117.08828: exiting _queue_task() for managed-node1/setup
12755 1727204117.08842: done queuing things up, now waiting for results queue to drain
12755 1727204117.08844: waiting for pending results...
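The setup task just queued is gated by the role variable `__network_required_facts` (loaded further down in the trace): facts are gathered only if some required fact is not already in `ansible_facts`. A loose sketch of that set-difference guard (names other than `__network_required_facts` are hypothetical, and the real check is a Jinja2 expression in set_facts.yml):

```python
# Hedged sketch of a required-facts guard like the one behind
# "Ensure ansible_facts used by role are present": compute which required
# fact names are missing; a non-empty result would trigger a fact gather.

def missing_required_facts(required, ansible_facts):
    """Return the required fact names not yet present in ansible_facts."""
    return [name for name in required if name not in ansible_facts]

if __name__ == "__main__":
    required = ["distribution", "distribution_major_version"]  # hypothetical subset
    gathered = {"distribution": "Fedora"}
    # 'distribution_major_version' is missing, so setup would need to run.
    print(missing_required_facts(required, gathered))
```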
12755 1727204117.09044: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
12755 1727204117.09176: in run() - task 12b410aa-8751-72e9-1a19-000000000796
12755 1727204117.09191: variable 'ansible_search_path' from source: unknown
12755 1727204117.09197: variable 'ansible_search_path' from source: unknown
12755 1727204117.09228: calling self._execute()
12755 1727204117.09314: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204117.09323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204117.09331: variable 'omit' from source: magic vars
12755 1727204117.09645: variable 'ansible_distribution_major_version' from source: facts
12755 1727204117.09656: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204117.09844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204117.11524: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204117.11581: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204117.11617: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204117.11652: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204117.11675: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204117.11750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204117.11774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204117.11797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204117.11835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204117.11847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204117.11895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204117.11915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204117.11942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204117.11975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204117.11987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204117.12121: variable '__network_required_facts' from source: role
'' defaults 12755 1727204117.12127: variable 'ansible_facts' from source: unknown 12755 1727204117.12816: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12755 1727204117.12823: when evaluation is False, skipping this task 12755 1727204117.12826: _execute() done 12755 1727204117.12829: dumping result to json 12755 1727204117.12831: done dumping result, returning 12755 1727204117.12839: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-72e9-1a19-000000000796] 12755 1727204117.12844: sending task result for task 12b410aa-8751-72e9-1a19-000000000796 12755 1727204117.12940: done sending task result for task 12b410aa-8751-72e9-1a19-000000000796 12755 1727204117.12943: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204117.12992: no more pending results, returning what we have 12755 1727204117.12996: results queue empty 12755 1727204117.12997: checking for any_errors_fatal 12755 1727204117.12999: done checking for any_errors_fatal 12755 1727204117.12999: checking for max_fail_percentage 12755 1727204117.13001: done checking for max_fail_percentage 12755 1727204117.13002: checking to see if all hosts have failed and the running result is not ok 12755 1727204117.13003: done checking to see if all hosts have failed 12755 1727204117.13004: getting the remaining hosts for this loop 12755 1727204117.13006: done getting the remaining hosts for this loop 12755 1727204117.13011: getting the next task for host managed-node1 12755 1727204117.13023: done getting next task for host managed-node1 12755 1727204117.13027: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204117.13032: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204117.13057: getting variables 12755 1727204117.13059: in VariableManager get_vars() 12755 1727204117.13124: Calling all_inventory to load vars for managed-node1 12755 1727204117.13128: Calling groups_inventory to load vars for managed-node1 12755 1727204117.13131: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204117.13142: Calling all_plugins_play to load vars for managed-node1 12755 1727204117.13146: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204117.13150: Calling groups_plugins_play to load vars for managed-node1 12755 1727204117.14387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204117.15988: done with get_vars() 12755 1727204117.16013: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.075) 0:00:42.396 ***** 12755 1727204117.16104: entering _queue_task() for managed-node1/stat 12755 1727204117.16365: worker is 1 (out of 1 
available) 12755 1727204117.16380: exiting _queue_task() for managed-node1/stat 12755 1727204117.16396: done queuing things up, now waiting for results queue to drain 12755 1727204117.16398: waiting for pending results... 12755 1727204117.16615: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204117.16750: in run() - task 12b410aa-8751-72e9-1a19-000000000798 12755 1727204117.16764: variable 'ansible_search_path' from source: unknown 12755 1727204117.16768: variable 'ansible_search_path' from source: unknown 12755 1727204117.16801: calling self._execute() 12755 1727204117.16891: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204117.16898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204117.16909: variable 'omit' from source: magic vars 12755 1727204117.17232: variable 'ansible_distribution_major_version' from source: facts 12755 1727204117.17243: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204117.17389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204117.17613: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204117.17654: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204117.17682: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204117.17715: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204117.17788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204117.17812: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204117.17841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204117.17861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204117.17941: variable '__network_is_ostree' from source: set_fact 12755 1727204117.17962: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204117.17965: when evaluation is False, skipping this task 12755 1727204117.17968: _execute() done 12755 1727204117.17971: dumping result to json 12755 1727204117.17973: done dumping result, returning 12755 1727204117.18131: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-72e9-1a19-000000000798] 12755 1727204117.18135: sending task result for task 12b410aa-8751-72e9-1a19-000000000798 12755 1727204117.18202: done sending task result for task 12b410aa-8751-72e9-1a19-000000000798 12755 1727204117.18205: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204117.18251: no more pending results, returning what we have 12755 1727204117.18254: results queue empty 12755 1727204117.18255: checking for any_errors_fatal 12755 1727204117.18262: done checking for any_errors_fatal 12755 1727204117.18263: checking for max_fail_percentage 12755 1727204117.18265: done checking for max_fail_percentage 12755 1727204117.18266: checking to see if all hosts have failed and the running result is not ok 12755 
1727204117.18267: done checking to see if all hosts have failed 12755 1727204117.18268: getting the remaining hosts for this loop 12755 1727204117.18270: done getting the remaining hosts for this loop 12755 1727204117.18274: getting the next task for host managed-node1 12755 1727204117.18280: done getting next task for host managed-node1 12755 1727204117.18284: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204117.18288: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204117.18310: getting variables 12755 1727204117.18311: in VariableManager get_vars() 12755 1727204117.18364: Calling all_inventory to load vars for managed-node1 12755 1727204117.18367: Calling groups_inventory to load vars for managed-node1 12755 1727204117.18370: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204117.18381: Calling all_plugins_play to load vars for managed-node1 12755 1727204117.18384: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204117.18388: Calling groups_plugins_play to load vars for managed-node1 12755 1727204117.20120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204117.21992: done with get_vars() 12755 1727204117.22032: done getting variables 12755 1727204117.22103: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.060) 0:00:42.457 ***** 12755 1727204117.22144: entering _queue_task() for managed-node1/set_fact 12755 1727204117.22486: worker is 1 (out of 1 available) 12755 1727204117.22502: exiting _queue_task() for managed-node1/set_fact 12755 1727204117.22515: done queuing things up, now waiting for results queue to drain 12755 1727204117.22516: waiting for pending results... 
12755 1727204117.22911: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204117.23058: in run() - task 12b410aa-8751-72e9-1a19-000000000799 12755 1727204117.23081: variable 'ansible_search_path' from source: unknown 12755 1727204117.23091: variable 'ansible_search_path' from source: unknown 12755 1727204117.23144: calling self._execute() 12755 1727204117.23334: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204117.23338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204117.23341: variable 'omit' from source: magic vars 12755 1727204117.23742: variable 'ansible_distribution_major_version' from source: facts 12755 1727204117.23763: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204117.23995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204117.24326: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204117.24390: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204117.24442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204117.24485: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204117.24596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204117.24692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204117.24701: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204117.24720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204117.24837: variable '__network_is_ostree' from source: set_fact 12755 1727204117.24850: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204117.24858: when evaluation is False, skipping this task 12755 1727204117.24865: _execute() done 12755 1727204117.24873: dumping result to json 12755 1727204117.24880: done dumping result, returning 12755 1727204117.24896: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-72e9-1a19-000000000799] 12755 1727204117.24906: sending task result for task 12b410aa-8751-72e9-1a19-000000000799 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204117.25073: no more pending results, returning what we have 12755 1727204117.25077: results queue empty 12755 1727204117.25079: checking for any_errors_fatal 12755 1727204117.25086: done checking for any_errors_fatal 12755 1727204117.25087: checking for max_fail_percentage 12755 1727204117.25090: done checking for max_fail_percentage 12755 1727204117.25091: checking to see if all hosts have failed and the running result is not ok 12755 1727204117.25093: done checking to see if all hosts have failed 12755 1727204117.25094: getting the remaining hosts for this loop 12755 1727204117.25096: done getting the remaining hosts for this loop 12755 1727204117.25101: getting the next task for host managed-node1 12755 1727204117.25113: done getting next task for host managed-node1 12755 
1727204117.25120: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204117.25124: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204117.25152: getting variables 12755 1727204117.25154: in VariableManager get_vars() 12755 1727204117.25417: Calling all_inventory to load vars for managed-node1 12755 1727204117.25424: Calling groups_inventory to load vars for managed-node1 12755 1727204117.25427: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204117.25439: Calling all_plugins_play to load vars for managed-node1 12755 1727204117.25443: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204117.25447: Calling groups_plugins_play to load vars for managed-node1 12755 1727204117.26106: done sending task result for task 12b410aa-8751-72e9-1a19-000000000799 12755 1727204117.26110: WORKER PROCESS EXITING 12755 1727204117.27842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204117.30742: done with get_vars() 12755 1727204117.30784: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:17 -0400 (0:00:00.087) 0:00:42.544 ***** 12755 1727204117.30907: entering _queue_task() for managed-node1/service_facts 12755 1727204117.31276: worker is 1 (out of 1 available) 12755 1727204117.31494: exiting _queue_task() for managed-node1/service_facts 12755 1727204117.31507: done queuing things up, now waiting for results queue to drain 12755 1727204117.31508: waiting for pending results... 12755 1727204117.31634: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204117.31845: in run() - task 12b410aa-8751-72e9-1a19-00000000079b 12755 1727204117.31874: variable 'ansible_search_path' from source: unknown 12755 1727204117.31884: variable 'ansible_search_path' from source: unknown 12755 1727204117.31934: calling self._execute() 12755 1727204117.32057: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204117.32077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204117.32099: variable 'omit' from source: magic vars 12755 1727204117.32583: variable 'ansible_distribution_major_version' from source: facts 12755 1727204117.32605: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204117.32621: variable 'omit' from source: magic vars 12755 1727204117.32732: variable 'omit' from source: magic vars 12755 1727204117.32784: variable 'omit' from source: magic vars 12755 1727204117.32841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204117.32892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204117.32923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 
1727204117.32949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204117.32975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204117.33016: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204117.33076: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204117.33080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204117.33171: Set connection var ansible_connection to ssh 12755 1727204117.33193: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204117.33202: Set connection var ansible_shell_type to sh 12755 1727204117.33224: Set connection var ansible_timeout to 10 12755 1727204117.33237: Set connection var ansible_shell_executable to /bin/sh 12755 1727204117.33248: Set connection var ansible_pipelining to False 12755 1727204117.33277: variable 'ansible_shell_executable' from source: unknown 12755 1727204117.33286: variable 'ansible_connection' from source: unknown 12755 1727204117.33395: variable 'ansible_module_compression' from source: unknown 12755 1727204117.33399: variable 'ansible_shell_type' from source: unknown 12755 1727204117.33401: variable 'ansible_shell_executable' from source: unknown 12755 1727204117.33403: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204117.33405: variable 'ansible_pipelining' from source: unknown 12755 1727204117.33408: variable 'ansible_timeout' from source: unknown 12755 1727204117.33410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204117.33580: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204117.33635: variable 'omit' from source: magic vars 12755 1727204117.33638: starting attempt loop 12755 1727204117.33641: running the handler 12755 1727204117.33644: _low_level_execute_command(): starting 12755 1727204117.33653: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204117.34528: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204117.34534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204117.34579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204117.36461: stdout chunk (state=3): >>>/root <<< 12755 1727204117.36687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204117.36694: stdout chunk (state=3): >>><<< 12755 1727204117.36697: stderr chunk (state=3): >>><<< 12755 
1727204117.36722: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204117.36747: _low_level_execute_command(): starting 12755 1727204117.36859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933 `" && echo ansible-tmp-1727204117.3673072-15265-111483674093933="` echo /root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933 `" ) && sleep 0' 12755 1727204117.37381: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204117.37497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204117.37565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204117.37614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204117.39747: stdout chunk (state=3): >>>ansible-tmp-1727204117.3673072-15265-111483674093933=/root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933 <<< 12755 1727204117.40021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204117.40025: stdout chunk (state=3): >>><<< 12755 1727204117.40028: stderr chunk (state=3): >>><<< 12755 1727204117.40195: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204117.3673072-15265-111483674093933=/root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204117.40199: variable 'ansible_module_compression' from source: unknown 12755 1727204117.40201: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12755 1727204117.40204: variable 'ansible_facts' from source: unknown 12755 1727204117.40302: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/AnsiballZ_service_facts.py 12755 1727204117.40567: Sending initial data 12755 1727204117.40570: Sent initial data (162 bytes) 12755 1727204117.41285: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204117.41307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204117.41323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204117.41349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204117.41405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204117.41476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204117.41497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204117.41520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204117.41759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204117.43468: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204117.43530: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204117.43598: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpln_6t4j4 /root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/AnsiballZ_service_facts.py <<< 12755 1727204117.43602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/AnsiballZ_service_facts.py" <<< 12755 1727204117.43673: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpln_6t4j4" to remote "/root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/AnsiballZ_service_facts.py" <<< 12755 1727204117.44903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204117.44988: stderr chunk (state=3): >>><<< 12755 1727204117.44993: stdout chunk (state=3): >>><<< 12755 1727204117.44996: done transferring module to remote 12755 1727204117.45004: _low_level_execute_command(): starting 12755 1727204117.45014: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/ /root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/AnsiballZ_service_facts.py && sleep 0' 12755 1727204117.46012: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204117.46123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204117.46141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204117.46161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204117.46230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204117.46384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204117.46443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204117.46661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204117.50066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204117.50318: stderr chunk (state=3): >>><<< 12755 1727204117.50322: stdout chunk (state=3): >>><<< 12755 1727204117.50383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204117.50396: _low_level_execute_command(): starting 12755 1727204117.50408: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/AnsiballZ_service_facts.py && sleep 0' 12755 1727204117.51984: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204117.52328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204117.52488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204119.68539: 
stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name"<<< 12755 1727204119.68731: stdout chunk (state=3): >>>: "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", 
"state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": 
"systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": 
{"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": 
"systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12755 1727204119.70483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204119.70505: stderr chunk (state=3): >>><<< 12755 1727204119.70515: stdout chunk (state=3): >>><<< 12755 1727204119.70559: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": 
"raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": 
"systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": 
"wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", 
"source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": 
"systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204119.71866: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204119.71871: _low_level_execute_command(): starting 12755 1727204119.71874: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204117.3673072-15265-111483674093933/ > /dev/null 2>&1 && sleep 0' 12755 1727204119.72472: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204119.72491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204119.72507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204119.72533: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204119.72550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204119.72562: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204119.72669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204119.72701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204119.72778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204119.75359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204119.75371: stdout chunk (state=3): >>><<< 12755 1727204119.75385: stderr chunk (state=3): >>><<< 12755 1727204119.75420: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204119.75747: handler run complete 12755 1727204119.76596: variable 'ansible_facts' from source: unknown 12755 1727204119.76995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204119.78360: variable 'ansible_facts' from source: unknown 12755 1727204119.78814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204119.79577: attempt loop complete, returning result 12755 1727204119.79808: _execute() done 12755 1727204119.79817: dumping result to json 12755 1727204119.79909: done dumping result, returning 12755 1727204119.79930: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-72e9-1a19-00000000079b] 12755 1727204119.79940: sending task result for task 12b410aa-8751-72e9-1a19-00000000079b ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204119.82024: no more pending results, returning what we have 12755 1727204119.82028: results queue empty 12755 1727204119.82030: checking for any_errors_fatal 12755 1727204119.82035: done checking for any_errors_fatal 12755 
1727204119.82036: checking for max_fail_percentage 12755 1727204119.82038: done checking for max_fail_percentage 12755 1727204119.82039: checking to see if all hosts have failed and the running result is not ok 12755 1727204119.82040: done checking to see if all hosts have failed 12755 1727204119.82041: getting the remaining hosts for this loop 12755 1727204119.82042: done getting the remaining hosts for this loop 12755 1727204119.82046: getting the next task for host managed-node1 12755 1727204119.82052: done getting next task for host managed-node1 12755 1727204119.82056: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12755 1727204119.82060: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204119.82074: getting variables 12755 1727204119.82076: in VariableManager get_vars() 12755 1727204119.82266: Calling all_inventory to load vars for managed-node1 12755 1727204119.82270: Calling groups_inventory to load vars for managed-node1 12755 1727204119.82273: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204119.82288: Calling all_plugins_play to load vars for managed-node1 12755 1727204119.82294: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204119.82300: Calling groups_plugins_play to load vars for managed-node1 12755 1727204119.82824: done sending task result for task 12b410aa-8751-72e9-1a19-00000000079b 12755 1727204119.83341: WORKER PROCESS EXITING 12755 1727204119.85055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204119.88048: done with get_vars() 12755 1727204119.88095: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:19 -0400 (0:00:02.573) 0:00:45.117 ***** 12755 1727204119.88220: entering _queue_task() for managed-node1/package_facts 12755 1727204119.88609: worker is 1 (out of 1 available) 12755 1727204119.88625: exiting _queue_task() for managed-node1/package_facts 12755 1727204119.88641: done queuing things up, now waiting for results queue to drain 12755 1727204119.88643: waiting for pending results... 
12755 1727204119.88978: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12755 1727204119.89200: in run() - task 12b410aa-8751-72e9-1a19-00000000079c 12755 1727204119.89231: variable 'ansible_search_path' from source: unknown 12755 1727204119.89295: variable 'ansible_search_path' from source: unknown 12755 1727204119.89300: calling self._execute() 12755 1727204119.89413: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204119.89433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204119.89456: variable 'omit' from source: magic vars 12755 1727204119.89917: variable 'ansible_distribution_major_version' from source: facts 12755 1727204119.89941: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204119.89955: variable 'omit' from source: magic vars 12755 1727204119.90058: variable 'omit' from source: magic vars 12755 1727204119.90294: variable 'omit' from source: magic vars 12755 1727204119.90298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204119.90301: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204119.90303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204119.90306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204119.90308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204119.90332: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204119.90343: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204119.90353: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12755 1727204119.90487: Set connection var ansible_connection to ssh 12755 1727204119.90503: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204119.90512: Set connection var ansible_shell_type to sh 12755 1727204119.90535: Set connection var ansible_timeout to 10 12755 1727204119.90547: Set connection var ansible_shell_executable to /bin/sh 12755 1727204119.90559: Set connection var ansible_pipelining to False 12755 1727204119.90592: variable 'ansible_shell_executable' from source: unknown 12755 1727204119.90606: variable 'ansible_connection' from source: unknown 12755 1727204119.90695: variable 'ansible_module_compression' from source: unknown 12755 1727204119.90698: variable 'ansible_shell_type' from source: unknown 12755 1727204119.90701: variable 'ansible_shell_executable' from source: unknown 12755 1727204119.90705: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204119.90707: variable 'ansible_pipelining' from source: unknown 12755 1727204119.90709: variable 'ansible_timeout' from source: unknown 12755 1727204119.90712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204119.90912: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204119.90940: variable 'omit' from source: magic vars 12755 1727204119.90951: starting attempt loop 12755 1727204119.90960: running the handler 12755 1727204119.90981: _low_level_execute_command(): starting 12755 1727204119.91045: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204119.91794: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204119.91814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12755 1727204119.91946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204119.91964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204119.91991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204119.92070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204119.93930: stdout chunk (state=3): >>>/root <<< 12755 1727204119.94168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204119.94172: stdout chunk (state=3): >>><<< 12755 1727204119.94174: stderr chunk (state=3): >>><<< 12755 1727204119.94206: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204119.94231: _low_level_execute_command(): starting 12755 1727204119.94244: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406 `" && echo ansible-tmp-1727204119.9421432-15334-54159668046406="` echo /root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406 `" ) && sleep 0' 12755 1727204119.94920: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204119.94943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204119.94959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204119.94977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204119.95012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204119.95061: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204119.95172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204119.95185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204119.95272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204119.97409: stdout chunk (state=3): >>>ansible-tmp-1727204119.9421432-15334-54159668046406=/root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406 <<< 12755 1727204119.97609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204119.97613: stdout chunk (state=3): >>><<< 12755 1727204119.97626: stderr chunk (state=3): >>><<< 12755 1727204119.97796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204119.9421432-15334-54159668046406=/root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204119.97801: variable 'ansible_module_compression' from source: unknown 12755 1727204119.97803: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12755 1727204119.97821: variable 'ansible_facts' from source: unknown 12755 1727204119.98053: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/AnsiballZ_package_facts.py 12755 1727204119.98313: Sending initial data 12755 1727204119.98317: Sent initial data (161 bytes) 12755 1727204119.98852: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204119.98864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204119.98876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204119.98909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204119.99004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204119.99024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204119.99102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204120.00894: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12755 1727204120.00938: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204120.00961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204120.01073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpmrwreehb /root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/AnsiballZ_package_facts.py <<< 12755 1727204120.01082: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/AnsiballZ_package_facts.py" <<< 12755 1727204120.01124: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpmrwreehb" to remote "/root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/AnsiballZ_package_facts.py" <<< 12755 1727204120.04122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204120.04343: stderr chunk (state=3): >>><<< 12755 1727204120.04347: stdout chunk (state=3): >>><<< 12755 1727204120.04351: done transferring module to remote 12755 1727204120.04353: _low_level_execute_command(): starting 12755 1727204120.04356: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/ /root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/AnsiballZ_package_facts.py && sleep 0' 12755 1727204120.05102: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204120.05225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204120.05231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204120.05293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204120.05355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204120.05406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204120.07539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204120.07543: stdout chunk (state=3): >>><<< 12755 1727204120.07549: stderr chunk (state=3): >>><<< 12755 1727204120.07595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204120.07599: _low_level_execute_command(): starting 12755 1727204120.07602: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/AnsiballZ_package_facts.py && sleep 0' 12755 1727204120.08314: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204120.08335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204120.08362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204120.08381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204120.08477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204120.08515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204120.08536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 
1727204120.08581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204120.08682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204120.74571: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", 
"version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": 
"dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": 
"libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 12755 1727204120.74613: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 12755 1727204120.74688: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": 
"4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", 
"release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 12755 1727204120.74701: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": 
[{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 12755 1727204120.74806: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": 
[{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", 
"release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", 
"version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": 
"121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": 
[{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release":
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": 
"1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": 
[{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications":
[{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": 
"first"}}} <<< 12755 1727204120.77154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204120.77158: stdout chunk (state=3): >>><<< 12755 1727204120.77161: stderr chunk (state=3): >>><<< 12755 1727204120.77535: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", 
"release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", 
"release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": 
"0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": 
[{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": 
"3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": 
"libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": 
"0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", 
"release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": 
[{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", 
"epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": 
"5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204120.92904: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204120.92938: _low_level_execute_command(): starting 12755 1727204120.92951: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204119.9421432-15334-54159668046406/ > /dev/null 2>&1 && sleep 0' 12755 1727204120.93607: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
<<< 12755 1727204120.93626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204120.93709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204120.93751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204120.93765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204120.93786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204120.93864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204120.96073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204120.96296: stderr chunk (state=3): >>><<< 12755 1727204120.96299: stdout chunk (state=3): >>><<< 12755 1727204120.96302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204120.96305: handler run complete 12755 1727204120.97885: variable 'ansible_facts' from source: unknown 12755 1727204120.98751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204121.02855: variable 'ansible_facts' from source: unknown 12755 1727204121.03709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204121.05284: attempt loop complete, returning result 12755 1727204121.05321: _execute() done 12755 1727204121.05344: dumping result to json 12755 1727204121.05699: done dumping result, returning 12755 1727204121.05777: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-72e9-1a19-00000000079c] 12755 1727204121.05781: sending task result for task 12b410aa-8751-72e9-1a19-00000000079c 12755 1727204121.15192: done sending task result for task 12b410aa-8751-72e9-1a19-00000000079c 12755 1727204121.15196: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was 
specified for this result", "changed": false } 12755 1727204121.15322: no more pending results, returning what we have 12755 1727204121.15325: results queue empty 12755 1727204121.15326: checking for any_errors_fatal 12755 1727204121.15330: done checking for any_errors_fatal 12755 1727204121.15331: checking for max_fail_percentage 12755 1727204121.15333: done checking for max_fail_percentage 12755 1727204121.15334: checking to see if all hosts have failed and the running result is not ok 12755 1727204121.15335: done checking to see if all hosts have failed 12755 1727204121.15336: getting the remaining hosts for this loop 12755 1727204121.15337: done getting the remaining hosts for this loop 12755 1727204121.15341: getting the next task for host managed-node1 12755 1727204121.15351: done getting next task for host managed-node1 12755 1727204121.15355: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204121.15359: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204121.15372: getting variables 12755 1727204121.15373: in VariableManager get_vars() 12755 1727204121.15416: Calling all_inventory to load vars for managed-node1 12755 1727204121.15419: Calling groups_inventory to load vars for managed-node1 12755 1727204121.15422: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204121.15430: Calling all_plugins_play to load vars for managed-node1 12755 1727204121.15433: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204121.15437: Calling groups_plugins_play to load vars for managed-node1 12755 1727204121.16725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204121.18876: done with get_vars() 12755 1727204121.18904: done getting variables 12755 1727204121.18948: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:21 -0400 (0:00:01.307) 0:00:46.425 ***** 12755 1727204121.18975: entering _queue_task() for managed-node1/debug 12755 1727204121.19258: worker is 1 (out of 1 available) 12755 1727204121.19274: exiting _queue_task() for managed-node1/debug 12755 1727204121.19291: done queuing things up, now waiting for results queue to drain 12755 1727204121.19292: waiting for pending results... 
12755 1727204121.19485: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204121.19608: in run() - task 12b410aa-8751-72e9-1a19-0000000000d0 12755 1727204121.19625: variable 'ansible_search_path' from source: unknown 12755 1727204121.19628: variable 'ansible_search_path' from source: unknown 12755 1727204121.19668: calling self._execute() 12755 1727204121.19764: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204121.19771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204121.19781: variable 'omit' from source: magic vars 12755 1727204121.20132: variable 'ansible_distribution_major_version' from source: facts 12755 1727204121.20144: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204121.20150: variable 'omit' from source: magic vars 12755 1727204121.20210: variable 'omit' from source: magic vars 12755 1727204121.20294: variable 'network_provider' from source: set_fact 12755 1727204121.20311: variable 'omit' from source: magic vars 12755 1727204121.20350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204121.20380: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204121.20410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204121.20426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204121.20437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204121.20475: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204121.20478: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 
1727204121.20482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204121.20570: Set connection var ansible_connection to ssh 12755 1727204121.20576: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204121.20579: Set connection var ansible_shell_type to sh 12755 1727204121.20591: Set connection var ansible_timeout to 10 12755 1727204121.20598: Set connection var ansible_shell_executable to /bin/sh 12755 1727204121.20604: Set connection var ansible_pipelining to False 12755 1727204121.20631: variable 'ansible_shell_executable' from source: unknown 12755 1727204121.20635: variable 'ansible_connection' from source: unknown 12755 1727204121.20638: variable 'ansible_module_compression' from source: unknown 12755 1727204121.20641: variable 'ansible_shell_type' from source: unknown 12755 1727204121.20644: variable 'ansible_shell_executable' from source: unknown 12755 1727204121.20648: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204121.20653: variable 'ansible_pipelining' from source: unknown 12755 1727204121.20658: variable 'ansible_timeout' from source: unknown 12755 1727204121.20663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204121.20803: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204121.20815: variable 'omit' from source: magic vars 12755 1727204121.20994: starting attempt loop 12755 1727204121.20997: running the handler 12755 1727204121.21000: handler run complete 12755 1727204121.21002: attempt loop complete, returning result 12755 1727204121.21005: _execute() done 12755 1727204121.21007: dumping result to json 12755 1727204121.21009: done dumping result, returning 
12755 1727204121.21011: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-72e9-1a19-0000000000d0]
12755 1727204121.21013: sending task result for task 12b410aa-8751-72e9-1a19-0000000000d0
12755 1727204121.21085: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d0
12755 1727204121.21088: WORKER PROCESS EXITING
ok: [managed-node1] => {}

MSG:

Using network provider: nm
12755 1727204121.21261: no more pending results, returning what we have
12755 1727204121.21264: results queue empty
12755 1727204121.21266: checking for any_errors_fatal
12755 1727204121.21275: done checking for any_errors_fatal
12755 1727204121.21276: checking for max_fail_percentage
12755 1727204121.21278: done checking for max_fail_percentage
12755 1727204121.21279: checking to see if all hosts have failed and the running result is not ok
12755 1727204121.21280: done checking to see if all hosts have failed
12755 1727204121.21281: getting the remaining hosts for this loop
12755 1727204121.21282: done getting the remaining hosts for this loop
12755 1727204121.21286: getting the next task for host managed-node1
12755 1727204121.21294: done getting next task for host managed-node1
12755 1727204121.21298: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
12755 1727204121.21302: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204121.21316: getting variables
12755 1727204121.21318: in VariableManager get_vars()
12755 1727204121.21370: Calling all_inventory to load vars for managed-node1
12755 1727204121.21373: Calling groups_inventory to load vars for managed-node1
12755 1727204121.21376: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204121.21386: Calling all_plugins_play to load vars for managed-node1
12755 1727204121.21407: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204121.21413: Calling groups_plugins_play to load vars for managed-node1
12755 1727204121.23279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204121.25340: done with get_vars()
12755 1727204121.25376: done getting variables
12755 1727204121.25448: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.065) 0:00:46.490 *****
12755 1727204121.25491: entering _queue_task() for managed-node1/fail
12755 1727204121.25856: worker is 1 (out of 1 available)
12755 1727204121.25870: exiting _queue_task() for managed-node1/fail
12755 1727204121.25885: done queuing things up, now waiting for results queue to drain
12755 1727204121.25887: waiting for pending results...
12755 1727204121.26198: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
12755 1727204121.26401: in run() - task 12b410aa-8751-72e9-1a19-0000000000d1
12755 1727204121.26433: variable 'ansible_search_path' from source: unknown
12755 1727204121.26444: variable 'ansible_search_path' from source: unknown
12755 1727204121.26499: calling self._execute()
12755 1727204121.26627: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204121.26640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204121.26697: variable 'omit' from source: magic vars
12755 1727204121.27153: variable 'ansible_distribution_major_version' from source: facts
12755 1727204121.27184: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204121.27437: variable 'network_state' from source: role '' defaults
12755 1727204121.27463: Evaluated conditional (network_state != {}): False
12755 1727204121.27521: when evaluation is False, skipping this task
12755 1727204121.27525: _execute() done
12755 1727204121.27528: dumping result to json
12755 1727204121.27531: done dumping result, returning
12755 1727204121.27534: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-72e9-1a19-0000000000d1]
12755 1727204121.27538: sending task result for task 12b410aa-8751-72e9-1a19-0000000000d1
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12755 1727204121.27687: no more pending results, returning what we have
12755 1727204121.27692: results queue empty
12755 1727204121.27694: checking for any_errors_fatal
12755 1727204121.27703: done checking for any_errors_fatal
12755 1727204121.27704: checking for max_fail_percentage
12755 1727204121.27706: done checking for max_fail_percentage
12755 1727204121.27707: checking to see if all hosts have failed and the running result is not ok
12755 1727204121.27708: done checking to see if all hosts have failed
12755 1727204121.27709: getting the remaining hosts for this loop
12755 1727204121.27710: done getting the remaining hosts for this loop
12755 1727204121.27716: getting the next task for host managed-node1
12755 1727204121.27724: done getting next task for host managed-node1
12755 1727204121.27729: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12755 1727204121.27732: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204121.27758: getting variables
12755 1727204121.27760: in VariableManager get_vars()
12755 1727204121.27819: Calling all_inventory to load vars for managed-node1
12755 1727204121.27823: Calling groups_inventory to load vars for managed-node1
12755 1727204121.27826: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204121.27833: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d1
12755 1727204121.27836: WORKER PROCESS EXITING
12755 1727204121.27846: Calling all_plugins_play to load vars for managed-node1
12755 1727204121.27850: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204121.27853: Calling groups_plugins_play to load vars for managed-node1
12755 1727204121.30055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204121.32647: done with get_vars()
12755 1727204121.32679: done getting variables
12755 1727204121.32736: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.072) 0:00:46.563 *****
12755 1727204121.32766: entering _queue_task() for managed-node1/fail
12755 1727204121.33047: worker is 1 (out of 1 available)
12755 1727204121.33063: exiting _queue_task() for managed-node1/fail
12755 1727204121.33077: done queuing things up, now waiting for results queue to drain
12755 1727204121.33079: waiting for pending results...
12755 1727204121.33283: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12755 1727204121.33595: in run() - task 12b410aa-8751-72e9-1a19-0000000000d2
12755 1727204121.33600: variable 'ansible_search_path' from source: unknown
12755 1727204121.33603: variable 'ansible_search_path' from source: unknown
12755 1727204121.33605: calling self._execute()
12755 1727204121.33662: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204121.33676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204121.33696: variable 'omit' from source: magic vars
12755 1727204121.34166: variable 'ansible_distribution_major_version' from source: facts
12755 1727204121.34185: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204121.34346: variable 'network_state' from source: role '' defaults
12755 1727204121.34365: Evaluated conditional (network_state != {}): False
12755 1727204121.34373: when evaluation is False, skipping this task
12755 1727204121.34380: _execute() done
12755 1727204121.34388: dumping result to json
12755 1727204121.34399: done dumping result, returning
12755 1727204121.34412: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-72e9-1a19-0000000000d2]
12755 1727204121.34426: sending task result for task 12b410aa-8751-72e9-1a19-0000000000d2
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12755 1727204121.34594: no more pending results, returning what we have
12755 1727204121.34599: results queue empty
12755 1727204121.34600: checking for any_errors_fatal
12755 1727204121.34608: done checking for any_errors_fatal
12755 1727204121.34609: checking for max_fail_percentage
12755 1727204121.34611: done checking for max_fail_percentage
12755 1727204121.34612: checking to see if all hosts have failed and the running result is not ok
12755 1727204121.34613: done checking to see if all hosts have failed
12755 1727204121.34614: getting the remaining hosts for this loop
12755 1727204121.34615: done getting the remaining hosts for this loop
12755 1727204121.34627: getting the next task for host managed-node1
12755 1727204121.34636: done getting next task for host managed-node1
12755 1727204121.34640: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12755 1727204121.34643: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204121.34670: getting variables
12755 1727204121.34672: in VariableManager get_vars()
12755 1727204121.34851: Calling all_inventory to load vars for managed-node1
12755 1727204121.34854: Calling groups_inventory to load vars for managed-node1
12755 1727204121.34857: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204121.34869: Calling all_plugins_play to load vars for managed-node1
12755 1727204121.34872: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204121.34882: Calling groups_plugins_play to load vars for managed-node1
12755 1727204121.35407: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d2
12755 1727204121.35411: WORKER PROCESS EXITING
12755 1727204121.37281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204121.40263: done with get_vars()
12755 1727204121.40310: done getting variables
12755 1727204121.40384: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.076) 0:00:46.640 *****
12755 1727204121.40430: entering _queue_task() for managed-node1/fail
12755 1727204121.40803: worker is 1 (out of 1 available)
12755 1727204121.40821: exiting _queue_task() for managed-node1/fail
12755 1727204121.40835: done queuing things up, now waiting for results queue to drain
12755 1727204121.40837: waiting for pending results...
12755 1727204121.41222: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12755 1727204121.41362: in run() - task 12b410aa-8751-72e9-1a19-0000000000d3
12755 1727204121.41384: variable 'ansible_search_path' from source: unknown
12755 1727204121.41395: variable 'ansible_search_path' from source: unknown
12755 1727204121.41447: calling self._execute()
12755 1727204121.41577: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204121.41641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204121.41645: variable 'omit' from source: magic vars
12755 1727204121.42086: variable 'ansible_distribution_major_version' from source: facts
12755 1727204121.42108: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204121.42351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204121.45039: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204121.45130: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204121.45295: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204121.45300: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204121.45303: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204121.45371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204121.45437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204121.45477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204121.45542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204121.45567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204121.45702: variable 'ansible_distribution_major_version' from source: facts
12755 1727204121.45731: Evaluated conditional (ansible_distribution_major_version | int > 9): True
12755 1727204121.45898: variable 'ansible_distribution' from source: facts
12755 1727204121.45910: variable '__network_rh_distros' from source: role '' defaults
12755 1727204121.45931: Evaluated conditional (ansible_distribution in __network_rh_distros): False
12755 1727204121.45941: when evaluation is False, skipping this task
12755 1727204121.45950: _execute() done
12755 1727204121.45965: dumping result to json
12755 1727204121.45976: done dumping result, returning
12755 1727204121.45992: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-72e9-1a19-0000000000d3]
12755 1727204121.46004: sending task result for task 12b410aa-8751-72e9-1a19-0000000000d3
12755 1727204121.46247: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d3
12755 1727204121.46251: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
12755 1727204121.46308: no more pending results, returning what we have
12755 1727204121.46312: results queue empty
12755 1727204121.46316: checking for any_errors_fatal
12755 1727204121.46323: done checking for any_errors_fatal
12755 1727204121.46324: checking for max_fail_percentage
12755 1727204121.46326: done checking for max_fail_percentage
12755 1727204121.46328: checking to see if all hosts have failed and the running result is not ok
12755 1727204121.46329: done checking to see if all hosts have failed
12755 1727204121.46330: getting the remaining hosts for this loop
12755 1727204121.46332: done getting the remaining hosts for this loop
12755 1727204121.46338: getting the next task for host managed-node1
12755 1727204121.46347: done getting next task for host managed-node1
12755 1727204121.46352: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12755 1727204121.46356: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204121.46385: getting variables
12755 1727204121.46387: in VariableManager get_vars()
12755 1727204121.46458: Calling all_inventory to load vars for managed-node1
12755 1727204121.46462: Calling groups_inventory to load vars for managed-node1
12755 1727204121.46465: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204121.46479: Calling all_plugins_play to load vars for managed-node1
12755 1727204121.46483: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204121.46487: Calling groups_plugins_play to load vars for managed-node1
12755 1727204121.48001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204121.50248: done with get_vars()
12755 1727204121.50277: done getting variables
12755 1727204121.50336: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.099) 0:00:46.739 *****
12755 1727204121.50364: entering _queue_task() for managed-node1/dnf
12755 1727204121.50643: worker is 1 (out of 1 available)
12755 1727204121.50660: exiting _queue_task() for managed-node1/dnf
12755 1727204121.50674: done queuing things up, now waiting for results queue to drain
12755 1727204121.50675: waiting for pending results...
12755 1727204121.50871: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12755 1727204121.50984: in run() - task 12b410aa-8751-72e9-1a19-0000000000d4
12755 1727204121.50998: variable 'ansible_search_path' from source: unknown
12755 1727204121.51001: variable 'ansible_search_path' from source: unknown
12755 1727204121.51037: calling self._execute()
12755 1727204121.51129: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204121.51136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204121.51149: variable 'omit' from source: magic vars
12755 1727204121.51499: variable 'ansible_distribution_major_version' from source: facts
12755 1727204121.51506: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204121.51741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204121.54030: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204121.54083: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204121.54194: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204121.54196: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204121.54198: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204121.54249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204121.54282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204121.54307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204121.54346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204121.54359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204121.54462: variable 'ansible_distribution' from source: facts
12755 1727204121.54466: variable 'ansible_distribution_major_version' from source: facts
12755 1727204121.54474: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
12755 1727204121.54573: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204121.54688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204121.54710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204121.54732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204121.54769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204121.54782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204121.54865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204121.54870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204121.54872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204121.54888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204121.54903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204121.54938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204121.54957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204121.54982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204121.55195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204121.55199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204121.55280: variable 'network_connections' from source: task vars
12755 1727204121.55302: variable 'controller_profile' from source: play vars
12755 1727204121.55383: variable 'controller_profile' from source: play vars
12755 1727204121.55416: variable 'controller_device' from source: play vars
12755 1727204121.55501: variable 'controller_device' from source: play vars
12755 1727204121.55521: variable 'port1_profile' from source: play vars
12755 1727204121.55598: variable 'port1_profile' from source: play vars
12755 1727204121.55612: variable 'dhcp_interface1' from source: play vars
12755 1727204121.55692: variable 'dhcp_interface1' from source: play vars
12755 1727204121.55712: variable 'controller_profile' from source: play vars
12755 1727204121.55794: variable 'controller_profile' from source: play vars
12755 1727204121.55808: variable 'port2_profile' from source: play vars
12755 1727204121.55885: variable 'port2_profile' from source: play vars
12755 1727204121.55902: variable 'dhcp_interface2' from source: play vars
12755 1727204121.55986: variable 'dhcp_interface2' from source: play vars
12755 1727204121.56001: variable 'controller_profile' from source: play vars
12755 1727204121.56076: variable 'controller_profile' from source: play vars
12755 1727204121.56177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204121.56396: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204121.56457: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204121.56502: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204121.56544: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204121.56604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204121.56643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204121.56722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204121.56732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204121.56811: variable '__network_team_connections_defined' from source: role '' defaults
12755 1727204121.57180: variable 'network_connections' from source: task vars
12755 1727204121.57190: variable 'controller_profile' from source: play vars
12755 1727204121.57242: variable 'controller_profile' from source: play vars
12755 1727204121.57249: variable 'controller_device' from source: play vars
12755 1727204121.57322: variable 'controller_device' from source: play vars
12755 1727204121.57425: variable 'port1_profile' from source: play vars
12755 1727204121.57434: variable 'port1_profile' from source: play vars
12755 1727204121.57437: variable 'dhcp_interface1' from source: play vars
12755 1727204121.57495: variable 'dhcp_interface1' from source: play vars
12755 1727204121.57499: variable 'controller_profile' from source: play vars
12755 1727204121.57698: variable 'controller_profile' from source: play vars
12755 1727204121.57701: variable 'port2_profile' from source: play vars
12755 1727204121.57703: variable 'port2_profile' from source: play vars
12755 1727204121.57706: variable 'dhcp_interface2' from source: play vars
12755 1727204121.57738: variable 'dhcp_interface2' from source: play vars
12755 1727204121.57751: variable 'controller_profile' from source: play vars
12755 1727204121.57828: variable 'controller_profile' from source: play vars
12755 1727204121.57872: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12755 1727204121.57881: when evaluation is False, skipping this task
12755 1727204121.57888: _execute() done
12755 1727204121.57901: dumping result to json
12755 1727204121.57909: done dumping result, returning
12755 1727204121.57923: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-0000000000d4]
12755 1727204121.57934: sending task result for task 12b410aa-8751-72e9-1a19-0000000000d4
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12755 1727204121.58250: no more pending results, returning what we have
12755 1727204121.58254: results queue empty
12755 1727204121.58256: checking for any_errors_fatal
12755 1727204121.58263: done checking for any_errors_fatal
12755 1727204121.58264: checking for max_fail_percentage
12755 1727204121.58266: done checking for max_fail_percentage
12755 1727204121.58267: checking to see if all hosts have failed and the running result is not ok
12755 1727204121.58268: done checking to see if all hosts have failed
12755 1727204121.58269: getting the remaining hosts for this loop
12755 1727204121.58271: done getting the remaining hosts for this loop
12755 1727204121.58277: getting the next task for host managed-node1
12755 1727204121.58285: done getting next task for host managed-node1
12755 1727204121.58293: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12755 1727204121.58296: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204121.58321: getting variables
12755 1727204121.58324: in VariableManager get_vars()
12755 1727204121.58599: Calling all_inventory to load vars for managed-node1
12755 1727204121.58603: Calling groups_inventory to load vars for managed-node1
12755 1727204121.58607: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204121.58617: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d4
12755 1727204121.58620: WORKER PROCESS EXITING
12755 1727204121.58640: Calling all_plugins_play to load vars for managed-node1
12755 1727204121.58669: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204121.58675: Calling groups_plugins_play to load vars for managed-node1
12755 1727204121.59942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204121.62111: done with get_vars()
12755 1727204121.62142: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12755 1727204121.62209: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.118) 0:00:46.858 *****
12755 1727204121.62239: entering _queue_task() for managed-node1/yum
12755 1727204121.62517: worker is 1 (out of 1 available)
12755 1727204121.62535: exiting _queue_task() for managed-node1/yum
12755 1727204121.62547: done queuing things up, now waiting for results queue to drain
12755 1727204121.62549: waiting for pending results...
12755 1727204121.62759: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12755 1727204121.62873: in run() - task 12b410aa-8751-72e9-1a19-0000000000d5
12755 1727204121.62889: variable 'ansible_search_path' from source: unknown
12755 1727204121.62896: variable 'ansible_search_path' from source: unknown
12755 1727204121.62933: calling self._execute()
12755 1727204121.63027: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204121.63033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204121.63043: variable 'omit' from source: magic vars
12755 1727204121.63380: variable 'ansible_distribution_major_version' from source: facts
12755 1727204121.63398: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204121.63562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204121.65657: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204121.65710: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204121.65748: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204121.65778: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204121.65808: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204121.65882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204121.65908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204121.65932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204121.65971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204121.65983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204121.66068: variable 'ansible_distribution_major_version' from source: facts
12755 1727204121.66083: Evaluated conditional (ansible_distribution_major_version | int < 8): False
12755 1727204121.66086: when evaluation is False, skipping this task
12755 1727204121.66091: _execute() done
12755 1727204121.66096: dumping result to json
12755 1727204121.66100: done dumping result, returning
12755 1727204121.66109: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-0000000000d5]
12755 1727204121.66117: sending task result for task 12b410aa-8751-72e9-1a19-0000000000d5
12755 1727204121.66219: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d5
12755 1727204121.66222: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition":
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12755 1727204121.66278: no more pending results, returning what we have 12755 1727204121.66282: results queue empty 12755 1727204121.66284: checking for any_errors_fatal 12755 1727204121.66293: done checking for any_errors_fatal 12755 1727204121.66294: checking for max_fail_percentage 12755 1727204121.66297: done checking for max_fail_percentage 12755 1727204121.66298: checking to see if all hosts have failed and the running result is not ok 12755 1727204121.66299: done checking to see if all hosts have failed 12755 1727204121.66300: getting the remaining hosts for this loop 12755 1727204121.66302: done getting the remaining hosts for this loop 12755 1727204121.66307: getting the next task for host managed-node1 12755 1727204121.66314: done getting next task for host managed-node1 12755 1727204121.66318: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12755 1727204121.66321: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204121.66347: getting variables 12755 1727204121.66349: in VariableManager get_vars() 12755 1727204121.66414: Calling all_inventory to load vars for managed-node1 12755 1727204121.66417: Calling groups_inventory to load vars for managed-node1 12755 1727204121.66420: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204121.66431: Calling all_plugins_play to load vars for managed-node1 12755 1727204121.66434: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204121.66438: Calling groups_plugins_play to load vars for managed-node1 12755 1727204121.67808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204121.69403: done with get_vars() 12755 1727204121.69427: done getting variables 12755 1727204121.69477: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.072) 0:00:46.930 ***** 12755 1727204121.69507: entering _queue_task() for managed-node1/fail 12755 1727204121.69766: worker is 1 (out of 1 available) 12755 1727204121.69781: exiting _queue_task() for managed-node1/fail 12755 1727204121.69795: done queuing things up, now waiting for results queue to drain 12755 1727204121.69797: waiting for pending results... 
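The YUM-check task skipped earlier in this trace reports `"false_condition": "ansible_distribution_major_version | int < 8"`, so on this host (major version 8 or later) the task never runs and the `yum` action is transparently redirected to `dnf`. A minimal sketch of what such a guarded task could look like — the task body and module arguments are assumed for illustration, not taken from the actual role source:

```yaml
# Hypothetical reconstruction of the skipped check (illustration only; the
# real task lives at roles/network/tasks/main.yml:48 in the collection).
- name: Check if updates for network packages are available through YUM
  ansible.builtin.yum:        # redirected to ansible.builtin.dnf at runtime
    list: updates
  when: ansible_distribution_major_version | int < 8
```

Because facts such as `ansible_distribution_major_version` arrive as strings, the `| int` filter is what makes the numeric comparison meaningful; without it the condition would compare a string against an integer.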
12755 1727204121.69997: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12755 1727204121.70103: in run() - task 12b410aa-8751-72e9-1a19-0000000000d6 12755 1727204121.70118: variable 'ansible_search_path' from source: unknown 12755 1727204121.70122: variable 'ansible_search_path' from source: unknown 12755 1727204121.70155: calling self._execute() 12755 1727204121.70251: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204121.70259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204121.70270: variable 'omit' from source: magic vars 12755 1727204121.70598: variable 'ansible_distribution_major_version' from source: facts 12755 1727204121.70614: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204121.70715: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204121.70884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204121.72616: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204121.72669: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204121.72701: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204121.72733: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204121.72761: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204121.72831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12755 1727204121.72871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204121.72893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.72927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204121.72940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204121.72984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204121.73005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204121.73027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.73058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204121.73071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204121.73110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204121.73131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204121.73151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.73183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204121.73199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204121.73345: variable 'network_connections' from source: task vars 12755 1727204121.73355: variable 'controller_profile' from source: play vars 12755 1727204121.73410: variable 'controller_profile' from source: play vars 12755 1727204121.73421: variable 'controller_device' from source: play vars 12755 1727204121.73471: variable 'controller_device' from source: play vars 12755 1727204121.73480: variable 'port1_profile' from source: play vars 12755 1727204121.73535: variable 'port1_profile' from source: play vars 12755 1727204121.73543: variable 'dhcp_interface1' from source: play vars 12755 1727204121.73593: variable 'dhcp_interface1' from source: play vars 12755 1727204121.73600: variable 'controller_profile' from source: play vars 12755 
1727204121.73654: variable 'controller_profile' from source: play vars 12755 1727204121.73661: variable 'port2_profile' from source: play vars 12755 1727204121.73714: variable 'port2_profile' from source: play vars 12755 1727204121.73720: variable 'dhcp_interface2' from source: play vars 12755 1727204121.73772: variable 'dhcp_interface2' from source: play vars 12755 1727204121.73779: variable 'controller_profile' from source: play vars 12755 1727204121.73830: variable 'controller_profile' from source: play vars 12755 1727204121.73893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204121.74030: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204121.74062: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204121.74096: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204121.74124: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204121.74161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204121.74186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204121.74211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.74234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 12755 1727204121.74293: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204121.74513: variable 'network_connections' from source: task vars 12755 1727204121.74520: variable 'controller_profile' from source: play vars 12755 1727204121.74571: variable 'controller_profile' from source: play vars 12755 1727204121.74578: variable 'controller_device' from source: play vars 12755 1727204121.74634: variable 'controller_device' from source: play vars 12755 1727204121.74643: variable 'port1_profile' from source: play vars 12755 1727204121.74693: variable 'port1_profile' from source: play vars 12755 1727204121.74700: variable 'dhcp_interface1' from source: play vars 12755 1727204121.74755: variable 'dhcp_interface1' from source: play vars 12755 1727204121.74761: variable 'controller_profile' from source: play vars 12755 1727204121.74811: variable 'controller_profile' from source: play vars 12755 1727204121.74825: variable 'port2_profile' from source: play vars 12755 1727204121.74871: variable 'port2_profile' from source: play vars 12755 1727204121.74878: variable 'dhcp_interface2' from source: play vars 12755 1727204121.74932: variable 'dhcp_interface2' from source: play vars 12755 1727204121.74937: variable 'controller_profile' from source: play vars 12755 1727204121.74988: variable 'controller_profile' from source: play vars 12755 1727204121.75019: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12755 1727204121.75022: when evaluation is False, skipping this task 12755 1727204121.75025: _execute() done 12755 1727204121.75030: dumping result to json 12755 1727204121.75033: done dumping result, returning 12755 1727204121.75044: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-0000000000d6] 12755 1727204121.75050: sending 
task result for task 12b410aa-8751-72e9-1a19-0000000000d6 12755 1727204121.75152: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d6 12755 1727204121.75157: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12755 1727204121.75230: no more pending results, returning what we have 12755 1727204121.75233: results queue empty 12755 1727204121.75235: checking for any_errors_fatal 12755 1727204121.75243: done checking for any_errors_fatal 12755 1727204121.75244: checking for max_fail_percentage 12755 1727204121.75246: done checking for max_fail_percentage 12755 1727204121.75247: checking to see if all hosts have failed and the running result is not ok 12755 1727204121.75248: done checking to see if all hosts have failed 12755 1727204121.75249: getting the remaining hosts for this loop 12755 1727204121.75251: done getting the remaining hosts for this loop 12755 1727204121.75256: getting the next task for host managed-node1 12755 1727204121.75263: done getting next task for host managed-node1 12755 1727204121.75268: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12755 1727204121.75271: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204121.75297: getting variables 12755 1727204121.75299: in VariableManager get_vars() 12755 1727204121.75360: Calling all_inventory to load vars for managed-node1 12755 1727204121.75365: Calling groups_inventory to load vars for managed-node1 12755 1727204121.75367: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204121.75379: Calling all_plugins_play to load vars for managed-node1 12755 1727204121.75382: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204121.75386: Calling groups_plugins_play to load vars for managed-node1 12755 1727204121.76738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204121.78323: done with get_vars() 12755 1727204121.78348: done getting variables 12755 1727204121.78401: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.089) 0:00:47.020 ***** 12755 1727204121.78430: entering _queue_task() for managed-node1/package 12755 1727204121.78697: worker is 1 (out of 1 available) 12755 1727204121.78714: exiting _queue_task() for managed-node1/package 12755 1727204121.78727: done queuing things up, now waiting for results queue to drain 12755 1727204121.78729: waiting for pending results... 
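The consent task skipped above is a `fail`-action task gated on `__network_wireless_connections_defined or __network_team_connections_defined`; since neither bond-controller profile in `network_connections` defines a wireless or team interface, both variables evaluate false and the task is skipped. A hedged sketch of the shape of such a task — the message text is assumed, not quoted from the role:

```yaml
# Hypothetical sketch of the guarded consent task (assumed structure, not the
# actual role source at roles/network/tasks/main.yml:60).
- name: Ask user's consent to restart NetworkManager
  ansible.builtin.fail:
    msg: >-
      Wireless or team interfaces require restarting NetworkManager;
      rerun with explicit consent to proceed.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

This pattern (a `fail` task behind a `when:` guard) is how roles abort early with a clear message instead of failing later in a less obvious place.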
12755 1727204121.78925: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 12755 1727204121.79036: in run() - task 12b410aa-8751-72e9-1a19-0000000000d7 12755 1727204121.79048: variable 'ansible_search_path' from source: unknown 12755 1727204121.79052: variable 'ansible_search_path' from source: unknown 12755 1727204121.79090: calling self._execute() 12755 1727204121.79183: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204121.79197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204121.79208: variable 'omit' from source: magic vars 12755 1727204121.79550: variable 'ansible_distribution_major_version' from source: facts 12755 1727204121.79562: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204121.79741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204121.79964: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204121.80004: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204121.80034: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204121.80094: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204121.80186: variable 'network_packages' from source: role '' defaults 12755 1727204121.80278: variable '__network_provider_setup' from source: role '' defaults 12755 1727204121.80288: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204121.80343: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204121.80352: variable '__network_packages_default_nm' from source: role '' defaults 12755 1727204121.80407: variable 
'__network_packages_default_nm' from source: role '' defaults 12755 1727204121.80562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204121.82146: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204121.82197: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204121.82233: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204121.82263: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204121.82285: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204121.82360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204121.82383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204121.82406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.82442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204121.82459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 
1727204121.82497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204121.82518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204121.82541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.82574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204121.82587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204121.82778: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12755 1727204121.82873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204121.82901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204121.82922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.82953: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204121.82965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204121.83045: variable 'ansible_python' from source: facts 12755 1727204121.83067: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12755 1727204121.83139: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204121.83223: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12755 1727204121.83327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204121.83349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204121.83369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.83402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204121.83418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204121.83459: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204121.83481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204121.83507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.83546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204121.83562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204121.83681: variable 'network_connections' from source: task vars 12755 1727204121.83687: variable 'controller_profile' from source: play vars 12755 1727204121.83774: variable 'controller_profile' from source: play vars 12755 1727204121.83783: variable 'controller_device' from source: play vars 12755 1727204121.83866: variable 'controller_device' from source: play vars 12755 1727204121.83880: variable 'port1_profile' from source: play vars 12755 1727204121.83962: variable 'port1_profile' from source: play vars 12755 1727204121.83972: variable 'dhcp_interface1' from source: play vars 12755 1727204121.84053: variable 'dhcp_interface1' from source: play vars 12755 1727204121.84061: variable 'controller_profile' from source: play vars 12755 1727204121.84145: variable 'controller_profile' from source: play vars 12755 1727204121.84153: variable 'port2_profile' from source: play vars 12755 
1727204121.84238: variable 'port2_profile' from source: play vars 12755 1727204121.84247: variable 'dhcp_interface2' from source: play vars 12755 1727204121.84331: variable 'dhcp_interface2' from source: play vars 12755 1727204121.84340: variable 'controller_profile' from source: play vars 12755 1727204121.84424: variable 'controller_profile' from source: play vars 12755 1727204121.84480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204121.84507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204121.84537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204121.84562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204121.84606: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204121.84858: variable 'network_connections' from source: task vars 12755 1727204121.84862: variable 'controller_profile' from source: play vars 12755 1727204121.84936: variable 'controller_profile' from source: play vars 12755 1727204121.84944: variable 'controller_device' from source: play vars 12755 1727204121.85029: variable 'controller_device' from source: play vars 12755 1727204121.85039: variable 'port1_profile' from source: play vars 12755 1727204121.85125: variable 'port1_profile' from source: play vars 12755 1727204121.85134: variable 'dhcp_interface1' from source: play vars 12755 1727204121.85220: variable 'dhcp_interface1' from source: 
play vars 12755 1727204121.85228: variable 'controller_profile' from source: play vars 12755 1727204121.85311: variable 'controller_profile' from source: play vars 12755 1727204121.85322: variable 'port2_profile' from source: play vars 12755 1727204121.85402: variable 'port2_profile' from source: play vars 12755 1727204121.85411: variable 'dhcp_interface2' from source: play vars 12755 1727204121.85490: variable 'dhcp_interface2' from source: play vars 12755 1727204121.85502: variable 'controller_profile' from source: play vars 12755 1727204121.85582: variable 'controller_profile' from source: play vars 12755 1727204121.85633: variable '__network_packages_default_wireless' from source: role '' defaults 12755 1727204121.85698: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204121.85959: variable 'network_connections' from source: task vars 12755 1727204121.85962: variable 'controller_profile' from source: play vars 12755 1727204121.86020: variable 'controller_profile' from source: play vars 12755 1727204121.86028: variable 'controller_device' from source: play vars 12755 1727204121.86082: variable 'controller_device' from source: play vars 12755 1727204121.86092: variable 'port1_profile' from source: play vars 12755 1727204121.86147: variable 'port1_profile' from source: play vars 12755 1727204121.86156: variable 'dhcp_interface1' from source: play vars 12755 1727204121.86210: variable 'dhcp_interface1' from source: play vars 12755 1727204121.86218: variable 'controller_profile' from source: play vars 12755 1727204121.86273: variable 'controller_profile' from source: play vars 12755 1727204121.86281: variable 'port2_profile' from source: play vars 12755 1727204121.86336: variable 'port2_profile' from source: play vars 12755 1727204121.86344: variable 'dhcp_interface2' from source: play vars 12755 1727204121.86401: variable 'dhcp_interface2' from source: play vars 12755 1727204121.86407: variable 'controller_profile' from 
source: play vars 12755 1727204121.86462: variable 'controller_profile' from source: play vars 12755 1727204121.86487: variable '__network_packages_default_team' from source: role '' defaults 12755 1727204121.86555: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204121.86814: variable 'network_connections' from source: task vars 12755 1727204121.86822: variable 'controller_profile' from source: play vars 12755 1727204121.86874: variable 'controller_profile' from source: play vars 12755 1727204121.86881: variable 'controller_device' from source: play vars 12755 1727204121.86940: variable 'controller_device' from source: play vars 12755 1727204121.86949: variable 'port1_profile' from source: play vars 12755 1727204121.87002: variable 'port1_profile' from source: play vars 12755 1727204121.87010: variable 'dhcp_interface1' from source: play vars 12755 1727204121.87066: variable 'dhcp_interface1' from source: play vars 12755 1727204121.87073: variable 'controller_profile' from source: play vars 12755 1727204121.87129: variable 'controller_profile' from source: play vars 12755 1727204121.87138: variable 'port2_profile' from source: play vars 12755 1727204121.87193: variable 'port2_profile' from source: play vars 12755 1727204121.87200: variable 'dhcp_interface2' from source: play vars 12755 1727204121.87256: variable 'dhcp_interface2' from source: play vars 12755 1727204121.87266: variable 'controller_profile' from source: play vars 12755 1727204121.87321: variable 'controller_profile' from source: play vars 12755 1727204121.87383: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204121.87444: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204121.87453: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204121.87506: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 
1727204121.87688: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12755 1727204121.88093: variable 'network_connections' from source: task vars 12755 1727204121.88097: variable 'controller_profile' from source: play vars 12755 1727204121.88152: variable 'controller_profile' from source: play vars 12755 1727204121.88160: variable 'controller_device' from source: play vars 12755 1727204121.88213: variable 'controller_device' from source: play vars 12755 1727204121.88224: variable 'port1_profile' from source: play vars 12755 1727204121.88272: variable 'port1_profile' from source: play vars 12755 1727204121.88279: variable 'dhcp_interface1' from source: play vars 12755 1727204121.88330: variable 'dhcp_interface1' from source: play vars 12755 1727204121.88340: variable 'controller_profile' from source: play vars 12755 1727204121.88388: variable 'controller_profile' from source: play vars 12755 1727204121.88397: variable 'port2_profile' from source: play vars 12755 1727204121.88449: variable 'port2_profile' from source: play vars 12755 1727204121.88454: variable 'dhcp_interface2' from source: play vars 12755 1727204121.88505: variable 'dhcp_interface2' from source: play vars 12755 1727204121.88514: variable 'controller_profile' from source: play vars 12755 1727204121.88564: variable 'controller_profile' from source: play vars 12755 1727204121.88572: variable 'ansible_distribution' from source: facts 12755 1727204121.88577: variable '__network_rh_distros' from source: role '' defaults 12755 1727204121.88585: variable 'ansible_distribution_major_version' from source: facts 12755 1727204121.88608: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12755 1727204121.88833: variable 'ansible_distribution' from source: facts 12755 1727204121.88837: variable '__network_rh_distros' from source: role '' defaults 12755 1727204121.88839: variable 'ansible_distribution_major_version' from source: 
facts 12755 1727204121.88842: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12755 1727204121.88949: variable 'ansible_distribution' from source: facts 12755 1727204121.88953: variable '__network_rh_distros' from source: role '' defaults 12755 1727204121.88959: variable 'ansible_distribution_major_version' from source: facts 12755 1727204121.88988: variable 'network_provider' from source: set_fact 12755 1727204121.89003: variable 'ansible_facts' from source: unknown 12755 1727204121.89680: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12755 1727204121.89684: when evaluation is False, skipping this task 12755 1727204121.89687: _execute() done 12755 1727204121.89692: dumping result to json 12755 1727204121.89697: done dumping result, returning 12755 1727204121.89707: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-72e9-1a19-0000000000d7] 12755 1727204121.89714: sending task result for task 12b410aa-8751-72e9-1a19-0000000000d7 12755 1727204121.89815: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d7 12755 1727204121.89818: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12755 1727204121.89874: no more pending results, returning what we have 12755 1727204121.89877: results queue empty 12755 1727204121.89879: checking for any_errors_fatal 12755 1727204121.89884: done checking for any_errors_fatal 12755 1727204121.89885: checking for max_fail_percentage 12755 1727204121.89887: done checking for max_fail_percentage 12755 1727204121.89888: checking to see if all hosts have failed and the running result is not ok 12755 1727204121.89891: done checking to see if all hosts have failed 12755 1727204121.89892: getting the remaining hosts for 
this loop 12755 1727204121.89894: done getting the remaining hosts for this loop 12755 1727204121.89899: getting the next task for host managed-node1 12755 1727204121.89907: done getting next task for host managed-node1 12755 1727204121.89913: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12755 1727204121.89916: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204121.89943: getting variables 12755 1727204121.89945: in VariableManager get_vars() 12755 1727204121.90009: Calling all_inventory to load vars for managed-node1 12755 1727204121.90014: Calling groups_inventory to load vars for managed-node1 12755 1727204121.90017: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204121.90029: Calling all_plugins_play to load vars for managed-node1 12755 1727204121.90032: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204121.90036: Calling groups_plugins_play to load vars for managed-node1 12755 1727204121.91342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204121.93713: done with get_vars() 12755 1727204121.93760: done getting variables 12755 1727204121.93836: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:55:21 -0400 (0:00:00.154) 0:00:47.174 ***** 12755 1727204121.93880: entering _queue_task() for managed-node1/package 12755 1727204121.94268: worker is 1 (out of 1 available) 12755 1727204121.94284: exiting _queue_task() for managed-node1/package 12755 1727204121.94466: done queuing things up, now waiting for results queue to drain 12755 1727204121.94468: waiting for pending results... 12755 1727204121.94709: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12755 1727204121.95098: in run() - task 12b410aa-8751-72e9-1a19-0000000000d8 12755 1727204121.95103: variable 'ansible_search_path' from source: unknown 12755 1727204121.95106: variable 'ansible_search_path' from source: unknown 12755 1727204121.95497: calling self._execute() 12755 1727204121.95501: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204121.95504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204121.95508: variable 'omit' from source: magic vars 12755 1727204121.95962: variable 'ansible_distribution_major_version' from source: facts 12755 1727204121.95983: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204121.96158: variable 'network_state' from source: role '' defaults 12755 1727204121.96178: Evaluated conditional (network_state != {}): False 12755 1727204121.96186: when evaluation is False, skipping this task 12755 1727204121.96197: _execute() done 12755 
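The "Install packages" skip recorded above hinges on Ansible's builtin `subset` test: the conditional `not network_packages is subset(ansible_facts.packages.keys())` is True only when some requested package is missing from the gathered package facts, so with everything already installed the task is skipped. A minimal Python sketch of that semantics (the `facts` values are illustrative, not taken from this run; the real logic lives in the fedora.linux_system_roles.network role):

```python
# Sketch of the conditional seen in the log:
#   not network_packages is subset(ansible_facts.packages.keys())
# Ansible's builtin `subset` test is plain set containment, so the
# task is skipped when every requested package already appears in
# the package facts.

def install_needed(network_packages, packages_facts):
    """Return True when at least one requested package is missing."""
    return not set(network_packages) <= set(packages_facts.keys())

# Illustrative facts dict, shaped like ansible_facts.packages.
facts = {"NetworkManager": [{"version": "1.48"}], "openssh": [{"version": "9.3"}]}
print(install_needed(["NetworkManager"], facts))             # False -> task skipped
print(install_needed(["NetworkManager", "nmstate"], facts))  # True  -> task would run
```

In this run the result was False, which is exactly the `skip_reason: Conditional result was False` shown for managed-node1.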
1727204121.96205: dumping result to json 12755 1727204121.96213: done dumping result, returning 12755 1727204121.96229: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-72e9-1a19-0000000000d8] 12755 1727204121.96241: sending task result for task 12b410aa-8751-72e9-1a19-0000000000d8 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204121.96432: no more pending results, returning what we have 12755 1727204121.96437: results queue empty 12755 1727204121.96439: checking for any_errors_fatal 12755 1727204121.96449: done checking for any_errors_fatal 12755 1727204121.96450: checking for max_fail_percentage 12755 1727204121.96452: done checking for max_fail_percentage 12755 1727204121.96454: checking to see if all hosts have failed and the running result is not ok 12755 1727204121.96455: done checking to see if all hosts have failed 12755 1727204121.96456: getting the remaining hosts for this loop 12755 1727204121.96458: done getting the remaining hosts for this loop 12755 1727204121.96463: getting the next task for host managed-node1 12755 1727204121.96472: done getting next task for host managed-node1 12755 1727204121.96477: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12755 1727204121.96481: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204121.96511: getting variables 12755 1727204121.96514: in VariableManager get_vars() 12755 1727204121.96588: Calling all_inventory to load vars for managed-node1 12755 1727204121.96798: Calling groups_inventory to load vars for managed-node1 12755 1727204121.96803: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204121.96818: Calling all_plugins_play to load vars for managed-node1 12755 1727204121.96823: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204121.96827: Calling groups_plugins_play to load vars for managed-node1 12755 1727204121.97428: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d8 12755 1727204121.97432: WORKER PROCESS EXITING 12755 1727204121.98843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204122.01730: done with get_vars() 12755 1727204122.01773: done getting variables 12755 1727204122.01849: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.080) 0:00:47.254 ***** 12755 1727204122.01892: entering _queue_task() for managed-node1/package 12755 1727204122.02249: worker is 1 (out of 1 available) 12755 1727204122.02263: exiting _queue_task() for managed-node1/package 12755 1727204122.02277: done queuing things up, now waiting for results queue to drain 12755 1727204122.02278: waiting for pending results... 
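Both nmstate-related install tasks ("Install NetworkManager and nmstate..." and "Install python3-libnmstate...") are gated on the same `network_state != {}` conditional, and the log shows `network_state` resolving from the role defaults, where it is an empty dict. A hedged sketch of that guard (the non-empty example value is hypothetical, merely shaped like an nmstate state):

```python
# The install tasks above are gated on the role variable
# `network_state`; its role default is an empty dict, so
# `network_state != {}` evaluates False and both tasks are skipped.

def uses_network_state(network_state):
    """True only when the caller supplied a non-empty network_state."""
    return network_state != {}

print(uses_network_state({}))                  # False -> skip, as in this run
print(uses_network_state({"interfaces": []}))  # True  -> nmstate packages needed
```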
12755 1727204122.02598: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12755 1727204122.02796: in run() - task 12b410aa-8751-72e9-1a19-0000000000d9 12755 1727204122.02821: variable 'ansible_search_path' from source: unknown 12755 1727204122.02833: variable 'ansible_search_path' from source: unknown 12755 1727204122.02875: calling self._execute() 12755 1727204122.03001: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204122.03017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204122.03035: variable 'omit' from source: magic vars 12755 1727204122.03503: variable 'ansible_distribution_major_version' from source: facts 12755 1727204122.03525: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204122.03691: variable 'network_state' from source: role '' defaults 12755 1727204122.03715: Evaluated conditional (network_state != {}): False 12755 1727204122.03724: when evaluation is False, skipping this task 12755 1727204122.03731: _execute() done 12755 1727204122.03739: dumping result to json 12755 1727204122.03746: done dumping result, returning 12755 1727204122.03759: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-72e9-1a19-0000000000d9] 12755 1727204122.03771: sending task result for task 12b410aa-8751-72e9-1a19-0000000000d9 12755 1727204122.03975: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000d9 12755 1727204122.03978: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204122.04036: no more pending results, returning what we have 12755 1727204122.04041: results queue empty 12755 1727204122.04042: checking for 
any_errors_fatal 12755 1727204122.04049: done checking for any_errors_fatal 12755 1727204122.04050: checking for max_fail_percentage 12755 1727204122.04053: done checking for max_fail_percentage 12755 1727204122.04054: checking to see if all hosts have failed and the running result is not ok 12755 1727204122.04056: done checking to see if all hosts have failed 12755 1727204122.04057: getting the remaining hosts for this loop 12755 1727204122.04058: done getting the remaining hosts for this loop 12755 1727204122.04063: getting the next task for host managed-node1 12755 1727204122.04072: done getting next task for host managed-node1 12755 1727204122.04077: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12755 1727204122.04082: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204122.04114: getting variables 12755 1727204122.04116: in VariableManager get_vars() 12755 1727204122.04181: Calling all_inventory to load vars for managed-node1 12755 1727204122.04185: Calling groups_inventory to load vars for managed-node1 12755 1727204122.04188: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204122.04403: Calling all_plugins_play to load vars for managed-node1 12755 1727204122.04407: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204122.04413: Calling groups_plugins_play to load vars for managed-node1 12755 1727204122.06749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204122.09678: done with get_vars() 12755 1727204122.09729: done getting variables 12755 1727204122.09808: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.079) 0:00:47.334 ***** 12755 1727204122.09853: entering _queue_task() for managed-node1/service 12755 1727204122.10233: worker is 1 (out of 1 available) 12755 1727204122.10249: exiting _queue_task() for managed-node1/service 12755 1727204122.10263: done queuing things up, now waiting for results queue to drain 12755 1727204122.10264: waiting for pending results... 
12755 1727204122.10714: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12755 1727204122.10788: in run() - task 12b410aa-8751-72e9-1a19-0000000000da 12755 1727204122.10822: variable 'ansible_search_path' from source: unknown 12755 1727204122.10896: variable 'ansible_search_path' from source: unknown 12755 1727204122.10900: calling self._execute() 12755 1727204122.11004: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204122.11026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204122.11044: variable 'omit' from source: magic vars 12755 1727204122.11508: variable 'ansible_distribution_major_version' from source: facts 12755 1727204122.11530: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204122.11697: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204122.11972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204122.14680: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204122.14828: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204122.14842: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204122.14893: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204122.14938: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204122.15048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12755 1727204122.15113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204122.15259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204122.15263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204122.15266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204122.15312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204122.15349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204122.15391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204122.15451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204122.15478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204122.15542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204122.15580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204122.15624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204122.15696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204122.15705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204122.15957: variable 'network_connections' from source: task vars 12755 1727204122.16026: variable 'controller_profile' from source: play vars 12755 1727204122.16077: variable 'controller_profile' from source: play vars 12755 1727204122.16096: variable 'controller_device' from source: play vars 12755 1727204122.16182: variable 'controller_device' from source: play vars 12755 1727204122.16201: variable 'port1_profile' from source: play vars 12755 1727204122.16284: variable 'port1_profile' from source: play vars 12755 1727204122.16301: variable 'dhcp_interface1' from source: play vars 12755 1727204122.16384: variable 'dhcp_interface1' from source: play vars 12755 1727204122.16460: variable 'controller_profile' from source: play vars 12755 
1727204122.16480: variable 'controller_profile' from source: play vars 12755 1727204122.16496: variable 'port2_profile' from source: play vars 12755 1727204122.16577: variable 'port2_profile' from source: play vars 12755 1727204122.16594: variable 'dhcp_interface2' from source: play vars 12755 1727204122.16674: variable 'dhcp_interface2' from source: play vars 12755 1727204122.16692: variable 'controller_profile' from source: play vars 12755 1727204122.16770: variable 'controller_profile' from source: play vars 12755 1727204122.16870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204122.17095: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204122.17294: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204122.17297: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204122.17299: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204122.17301: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204122.17315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204122.17346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204122.17376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False)
12755 1727204122.17471: variable '__network_team_connections_defined' from source: role '' defaults
12755 1727204122.17823: variable 'network_connections' from source: task vars
12755 1727204122.17833: variable 'controller_profile' from source: play vars
12755 1727204122.17914: variable 'controller_profile' from source: play vars
12755 1727204122.17929: variable 'controller_device' from source: play vars
12755 1727204122.18013: variable 'controller_device' from source: play vars
12755 1727204122.18029: variable 'port1_profile' from source: play vars
12755 1727204122.18112: variable 'port1_profile' from source: play vars
12755 1727204122.18126: variable 'dhcp_interface1' from source: play vars
12755 1727204122.18205: variable 'dhcp_interface1' from source: play vars
12755 1727204122.18221: variable 'controller_profile' from source: play vars
12755 1727204122.18396: variable 'controller_profile' from source: play vars
12755 1727204122.18399: variable 'port2_profile' from source: play vars
12755 1727204122.18402: variable 'port2_profile' from source: play vars
12755 1727204122.18404: variable 'dhcp_interface2' from source: play vars
12755 1727204122.18476: variable 'dhcp_interface2' from source: play vars
12755 1727204122.18490: variable 'controller_profile' from source: play vars
12755 1727204122.18571: variable 'controller_profile' from source: play vars
12755 1727204122.18622: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12755 1727204122.18631: when evaluation is False, skipping this task
12755 1727204122.18639: _execute() done
12755 1727204122.18647: dumping result to json
12755 1727204122.18654: done dumping result, returning
12755 1727204122.18667: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-0000000000da]
12755 1727204122.18677: sending task result for task 12b410aa-8751-72e9-1a19-0000000000da
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12755 1727204122.18885: no more pending results, returning what we have
12755 1727204122.18891: results queue empty
12755 1727204122.18892: checking for any_errors_fatal
12755 1727204122.18902: done checking for any_errors_fatal
12755 1727204122.18903: checking for max_fail_percentage
12755 1727204122.18905: done checking for max_fail_percentage
12755 1727204122.18906: checking to see if all hosts have failed and the running result is not ok
12755 1727204122.18907: done checking to see if all hosts have failed
12755 1727204122.18910: getting the remaining hosts for this loop
12755 1727204122.18912: done getting the remaining hosts for this loop
12755 1727204122.18918: getting the next task for host managed-node1
12755 1727204122.18927: done getting next task for host managed-node1
12755 1727204122.18932: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
12755 1727204122.18936: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204122.18961: getting variables
12755 1727204122.18963: in VariableManager get_vars()
12755 1727204122.19133: Calling all_inventory to load vars for managed-node1
12755 1727204122.19137: Calling groups_inventory to load vars for managed-node1
12755 1727204122.19141: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204122.19384: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000da
12755 1727204122.19388: WORKER PROCESS EXITING
12755 1727204122.19401: Calling all_plugins_play to load vars for managed-node1
12755 1727204122.19406: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204122.19413: Calling groups_plugins_play to load vars for managed-node1
12755 1727204122.21586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204122.24558: done with get_vars()
12755 1727204122.24611: done getting variables
12755 1727204122.24690: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:55:22 -0400 (0:00:00.148) 0:00:47.483 *****
12755 1727204122.24734: entering _queue_task() for managed-node1/service
12755 1727204122.25123: worker is 1 (out of 1 available)
12755 1727204122.25138: exiting _queue_task() for managed-node1/service
12755 1727204122.25151: done queuing things up, now waiting for results queue to drain
12755 1727204122.25153: waiting for pending results...
12755 1727204122.25488: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
12755 1727204122.25784: in run() - task 12b410aa-8751-72e9-1a19-0000000000db
12755 1727204122.25950: variable 'ansible_search_path' from source: unknown
12755 1727204122.25964: variable 'ansible_search_path' from source: unknown
12755 1727204122.26091: calling self._execute()
12755 1727204122.26159: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204122.26173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204122.26197: variable 'omit' from source: magic vars
12755 1727204122.26665: variable 'ansible_distribution_major_version' from source: facts
12755 1727204122.26686: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204122.26920: variable 'network_provider' from source: set_fact
12755 1727204122.26933: variable 'network_state' from source: role '' defaults
12755 1727204122.26955: Evaluated conditional (network_provider == "nm" or network_state != {}): True
12755 1727204122.26967: variable 'omit' from source: magic vars
12755 1727204122.27043: variable 'omit' from source: magic vars
12755 1727204122.27169: variable 'network_service_name' from source: role '' defaults
12755 1727204122.27173: variable 'network_service_name' from source: role '' defaults
12755 1727204122.27319: variable '__network_provider_setup' from source: role '' defaults
12755 1727204122.27330: variable '__network_service_name_default_nm' from source: role '' defaults
12755 1727204122.27411: variable '__network_service_name_default_nm' from source: role '' defaults
12755 1727204122.27427: variable '__network_packages_default_nm' from source: role '' defaults
12755 1727204122.27497: variable '__network_packages_default_nm' from source: role '' defaults
12755 1727204122.27857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204122.30998: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204122.31003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204122.31054: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204122.31105: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204122.31151: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204122.31259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204122.31304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204122.31349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204122.31413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204122.31441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204122.31505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204122.31594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204122.31597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204122.31639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204122.31666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204122.32019: variable '__network_packages_default_gobject_packages' from source: role '' defaults
12755 1727204122.32182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204122.32226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204122.32260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204122.32324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204122.32394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204122.32471: variable 'ansible_python' from source: facts
12755 1727204122.32505: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
12755 1727204122.32623: variable '__network_wpa_supplicant_required' from source: role '' defaults
12755 1727204122.32734: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
12755 1727204122.32919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204122.32953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204122.32995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204122.33071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204122.33077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204122.33145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204122.33289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204122.33292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204122.33298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204122.33306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204122.33493: variable 'network_connections' from source: task vars
12755 1727204122.33515: variable 'controller_profile' from source: play vars
12755 1727204122.33613: variable 'controller_profile' from source: play vars
12755 1727204122.33635: variable 'controller_device' from source: play vars
12755 1727204122.33732: variable 'controller_device' from source: play vars
12755 1727204122.33751: variable 'port1_profile' from source: play vars
12755 1727204122.33848: variable 'port1_profile' from source: play vars
12755 1727204122.33865: variable 'dhcp_interface1' from source: play vars
12755 1727204122.33965: variable 'dhcp_interface1' from source: play vars
12755 1727204122.33981: variable 'controller_profile' from source: play vars
12755 1727204122.34078: variable 'controller_profile' from source: play vars
12755 1727204122.34101: variable 'port2_profile' from source: play vars
12755 1727204122.34273: variable 'port2_profile' from source: play vars
12755 1727204122.34277: variable 'dhcp_interface2' from source: play vars
12755 1727204122.34314: variable 'dhcp_interface2' from source: play vars
12755 1727204122.34332: variable 'controller_profile' from source: play vars
12755 1727204122.34430: variable 'controller_profile' from source: play vars
12755 1727204122.34568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204122.34819: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204122.34883: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204122.34951: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204122.35004: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204122.35103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204122.35154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204122.35199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204122.35295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204122.35321: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204122.35712: variable 'network_connections' from source: task vars
12755 1727204122.35725: variable 'controller_profile' from source: play vars
12755 1727204122.35824: variable 'controller_profile' from source: play vars
12755 1727204122.35845: variable 'controller_device' from source: play vars
12755 1727204122.35938: variable 'controller_device' from source: play vars
12755 1727204122.35963: variable 'port1_profile' from source: play vars
12755 1727204122.36197: variable 'port1_profile' from source: play vars
12755 1727204122.36201: variable 'dhcp_interface1' from source: play vars
12755 1727204122.36203: variable 'dhcp_interface1' from source: play vars
12755 1727204122.36205: variable 'controller_profile' from source: play vars
12755 1727204122.36270: variable 'controller_profile' from source: play vars
12755 1727204122.36288: variable 'port2_profile' from source: play vars
12755 1727204122.36386: variable 'port2_profile' from source: play vars
12755 1727204122.36406: variable 'dhcp_interface2' from source: play vars
12755 1727204122.36500: variable 'dhcp_interface2' from source: play vars
12755 1727204122.36519: variable 'controller_profile' from source: play vars
12755 1727204122.36615: variable 'controller_profile' from source: play vars
12755 1727204122.36681: variable '__network_packages_default_wireless' from source: role '' defaults
12755 1727204122.36995: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204122.37796: variable 'network_connections' from source: task vars
12755 1727204122.37799: variable 'controller_profile' from source: play vars
12755 1727204122.37931: variable 'controller_profile' from source: play vars
12755 1727204122.37962: variable 'controller_device' from source: play vars
12755 1727204122.38102: variable 'controller_device' from source: play vars
12755 1727204122.38155: variable 'port1_profile' from source: play vars
12755 1727204122.38260: variable 'port1_profile' from source: play vars
12755 1727204122.38279: variable 'dhcp_interface1' from source: play vars
12755 1727204122.38388: variable 'dhcp_interface1' from source: play vars
12755 1727204122.38406: variable 'controller_profile' from source: play vars
12755 1727204122.38501: variable 'controller_profile' from source: play vars
12755 1727204122.38519: variable 'port2_profile' from source: play vars
12755 1727204122.38616: variable 'port2_profile' from source: play vars
12755 1727204122.38630: variable 'dhcp_interface2' from source: play vars
12755 1727204122.38725: variable 'dhcp_interface2' from source: play vars
12755 1727204122.38827: variable 'controller_profile' from source: play vars
12755 1727204122.38834: variable 'controller_profile' from source: play vars
12755 1727204122.38870: variable '__network_packages_default_team' from source: role '' defaults
12755 1727204122.38981: variable '__network_team_connections_defined' from source: role '' defaults
12755 1727204122.39413: variable 'network_connections' from source: task vars
12755 1727204122.39426: variable 'controller_profile' from source: play vars
12755 1727204122.39523: variable 'controller_profile' from source: play vars
12755 1727204122.39537: variable 'controller_device' from source: play vars
12755 1727204122.39633: variable 'controller_device' from source: play vars
12755 1727204122.39649: variable 'port1_profile' from source: play vars
12755 1727204122.39741: variable 'port1_profile' from source: play vars
12755 1727204122.39756: variable 'dhcp_interface1' from source: play vars
12755 1727204122.39849: variable 'dhcp_interface1' from source: play vars
12755 1727204122.39862: variable 'controller_profile' from source: play vars
12755 1727204122.39960: variable 'controller_profile' from source: play vars
12755 1727204122.39973: variable 'port2_profile' from source: play vars
12755 1727204122.40136: variable 'port2_profile' from source: play vars
12755 1727204122.40139: variable 'dhcp_interface2' from source: play vars
12755 1727204122.40363: variable 'dhcp_interface2' from source: play vars
12755 1727204122.40366: variable 'controller_profile' from source: play vars
12755 1727204122.40413: variable 'controller_profile' from source: play vars
12755 1727204122.40505: variable '__network_service_name_default_initscripts' from source: role '' defaults
12755 1727204122.40583: variable '__network_service_name_default_initscripts' from source: role '' defaults
12755 1727204122.40599: variable '__network_packages_default_initscripts' from source: role '' defaults
12755 1727204122.40678: variable '__network_packages_default_initscripts' from source: role '' defaults
12755 1727204122.40978: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
12755 1727204122.41656: variable 'network_connections' from source: task vars
12755 1727204122.41672: variable 'controller_profile' from source: play vars
12755 1727204122.41752: variable 'controller_profile' from source: play vars
12755 1727204122.41765: variable 'controller_device' from source: play vars
12755 1727204122.41850: variable 'controller_device' from source: play vars
12755 1727204122.41867: variable 'port1_profile' from source: play vars
12755 1727204122.41951: variable 'port1_profile' from source: play vars
12755 1727204122.41964: variable 'dhcp_interface1' from source: play vars
12755 1727204122.42046: variable 'dhcp_interface1' from source: play vars
12755 1727204122.42059: variable 'controller_profile' from source: play vars
12755 1727204122.42140: variable 'controller_profile' from source: play vars
12755 1727204122.42218: variable 'port2_profile' from source: play vars
12755 1727204122.42236: variable 'port2_profile' from source: play vars
12755 1727204122.42249: variable 'dhcp_interface2' from source: play vars
12755 1727204122.42330: variable 'dhcp_interface2' from source: play vars
12755 1727204122.42345: variable 'controller_profile' from source: play vars
12755 1727204122.42423: variable 'controller_profile' from source: play vars
12755 1727204122.42441: variable 'ansible_distribution' from source: facts
12755 1727204122.42450: variable '__network_rh_distros' from source: role '' defaults
12755 1727204122.42462: variable 'ansible_distribution_major_version' from source: facts
12755 1727204122.42498: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
12755 1727204122.42747: variable 'ansible_distribution' from source: facts
12755 1727204122.42762: variable '__network_rh_distros' from source: role '' defaults
12755 1727204122.42772: variable 'ansible_distribution_major_version' from source: facts
12755 1727204122.42783: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
12755 1727204122.43096: variable 'ansible_distribution' from source: facts
12755 1727204122.43099: variable '__network_rh_distros' from source: role '' defaults
12755 1727204122.43101: variable 'ansible_distribution_major_version' from source: facts
12755 1727204122.43103: variable 'network_provider' from source: set_fact
12755 1727204122.43129: variable 'omit' from source: magic vars
12755 1727204122.43171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204122.43222: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204122.43251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204122.43277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204122.43297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204122.43394: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12755 1727204122.43397: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204122.43400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204122.43500: Set connection var ansible_connection to ssh
12755 1727204122.43516: Set connection var ansible_module_compression to ZIP_DEFLATED
12755 1727204122.43524: Set connection var ansible_shell_type to sh
12755 1727204122.43547: Set connection var ansible_timeout to 10
12755 1727204122.43559: Set connection var ansible_shell_executable to /bin/sh
12755 1727204122.43572: Set connection var ansible_pipelining to False
12755 1727204122.43607: variable 'ansible_shell_executable' from source: unknown
12755 1727204122.43652: variable 'ansible_connection' from source: unknown
12755 1727204122.43655: variable 'ansible_module_compression' from source: unknown
12755 1727204122.43657: variable 'ansible_shell_type' from source: unknown
12755 1727204122.43659: variable 'ansible_shell_executable' from source: unknown
12755 1727204122.43661: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204122.43663: variable 'ansible_pipelining' from source: unknown
12755 1727204122.43665: variable 'ansible_timeout' from source: unknown
12755 1727204122.43667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204122.43804: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12755 1727204122.43825: variable 'omit' from source: magic vars
12755 1727204122.43870: starting attempt loop
12755 1727204122.43873: running the handler
12755 1727204122.43951: variable 'ansible_facts' from source: unknown
12755 1727204122.45220: _low_level_execute_command(): starting
12755 1727204122.45237: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12755 1727204122.46054: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
12755 1727204122.46071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204122.46212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12755 1727204122.46218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<<
12755 1727204122.46243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204122.46261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204122.46511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204122.48224: stdout chunk (state=3): >>>/root <<<
12755 1727204122.48496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204122.48500: stdout chunk (state=3): >>><<<
12755 1727204122.48502: stderr chunk (state=3): >>><<<
12755 1727204122.48524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12755 1727204122.48600: _low_level_execute_command(): starting
12755 1727204122.48604: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431 `" && echo ansible-tmp-1727204122.4853601-15397-43573597368431="` echo /root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431 `" ) && sleep 0'
12755 1727204122.49916: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12755 1727204122.49920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204122.49923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204122.49925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204122.50185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204122.50192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204122.50374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204122.52597: stdout chunk (state=3): >>>ansible-tmp-1727204122.4853601-15397-43573597368431=/root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431 <<<
12755 1727204122.52647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204122.52733: stderr chunk (state=3): >>><<<
12755 1727204122.52745: stdout chunk (state=3): >>><<<
12755 1727204122.52821: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204122.4853601-15397-43573597368431=/root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12755 1727204122.53198: variable 'ansible_module_compression' from source: unknown
12755 1727204122.53201: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED
12755 1727204122.53205: variable 'ansible_facts' from source: unknown
12755 1727204122.53578: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/AnsiballZ_systemd.py
12755 1727204122.54090: Sending initial data
12755 1727204122.54101: Sent initial data (155 bytes)
12755 1727204122.54673: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
12755 1727204122.54692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204122.54713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12755 1727204122.54737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12755 1727204122.54757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<<
12755 1727204122.54770: stderr chunk (state=3): >>>debug2: match not found <<<
12755 1727204122.54783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204122.54806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12755 1727204122.54844: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204122.54923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12755 1727204122.54953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204122.54973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204122.55039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204122.56915: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
12755 1727204122.56948: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
12755 1727204122.57005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmphib5k8dx /root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/AnsiballZ_systemd.py <<<
12755 1727204122.57038: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/AnsiballZ_systemd.py" <<<
12755 1727204122.57082: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmphib5k8dx" to remote "/root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/AnsiballZ_systemd.py" <<<
12755 1727204122.60023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204122.60297: stderr chunk (state=3): >>><<<
12755 1727204122.60301: stdout chunk (state=3): >>><<<
12755 1727204122.60303: done transferring module to remote
12755 1727204122.60306: _low_level_execute_command(): starting
12755 1727204122.60308: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/ /root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/AnsiballZ_systemd.py && sleep 0'
12755 1727204122.61101: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
12755 1727204122.61122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204122.61160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12755 1727204122.61206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204122.61295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12755 1727204122.61321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204122.61369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204122.61404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204122.63574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204122.63611: stdout chunk (state=3): >>><<<
12755 1727204122.63623: stderr chunk (state=3): >>><<<
12755 1727204122.63749: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204122.63753: _low_level_execute_command(): starting 12755 1727204122.63755: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/AnsiballZ_systemd.py && sleep 0' 12755 1727204122.64841: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204122.64877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204122.65160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204122.65195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204122.65284: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 12755 1727204122.99228: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "12115968", "MemoryAvailable": "infinity", "CPUUsageNSec": "1024368000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 12755 1727204122.99378: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": 
"read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target 
shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": 
"NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12755 1727204123.01800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204123.01804: stdout chunk (state=3): >>><<< 12755 1727204123.01807: stderr chunk (state=3): >>><<< 12755 1727204123.01813: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager 
--no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "12115968", "MemoryAvailable": "infinity", "CPUUsageNSec": "1024368000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": 
"18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin 
cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": 
"NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", 
"AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204123.02251: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204123.02496: _low_level_execute_command(): starting 12755 1727204123.02500: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204122.4853601-15397-43573597368431/ > /dev/null 2>&1 && sleep 0' 12755 1727204123.03632: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204123.03684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204123.03856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12755 1727204123.03899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204123.04019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204123.04200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204123.06427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204123.06431: stdout chunk (state=3): >>><<<
12755 1727204123.06441: stderr chunk (state=3): >>><<<
12755 1727204123.06477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.210 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
12755 1727204123.06486: handler run complete
12755 1727204123.06797: attempt loop complete, returning result
12755 1727204123.06801: _execute() done
12755 1727204123.06805: dumping result to json
12755 1727204123.06824: done dumping result, returning
12755 1727204123.06838: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-72e9-1a19-0000000000db]
12755 1727204123.06844: sending task result for task 12b410aa-8751-72e9-1a19-0000000000db
12755 1727204123.07339: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000db
12755 1727204123.07344: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
12755 1727204123.07411: no more pending results, returning what we have
12755 1727204123.07415: results queue empty
12755 1727204123.07417: checking for any_errors_fatal
12755 1727204123.07425: done checking for any_errors_fatal
12755 1727204123.07426: checking for max_fail_percentage
12755 1727204123.07428: done checking for max_fail_percentage
12755 1727204123.07429: checking to see if all hosts have failed and the running result is not ok
12755 1727204123.07430: done checking to see if all hosts have failed
12755 1727204123.07431: getting the remaining hosts for this loop
12755 1727204123.07433: done getting the remaining hosts for this loop
12755 1727204123.07438: getting the next task for host managed-node1
12755 1727204123.07447: done getting next task for host managed-node1
12755 1727204123.07451: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
12755 1727204123.07455: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204123.07470: getting variables
12755 1727204123.07472: in VariableManager get_vars()
12755 1727204123.07538: Calling all_inventory to load vars for managed-node1
12755 1727204123.07542: Calling groups_inventory to load vars for managed-node1
12755 1727204123.07546: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204123.07559: Calling all_plugins_play to load vars for managed-node1
12755 1727204123.07562: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204123.07566: Calling groups_plugins_play to load vars for managed-node1
12755 1727204123.13078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204123.16363: done with get_vars()
12755 1727204123.16424: done getting variables
12755 1727204123.16531: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.918) 0:00:48.401 *****
12755 1727204123.16573: entering _queue_task() for managed-node1/service
12755 1727204123.17403: worker is 1 (out of 1 available)
12755 1727204123.17422: exiting _queue_task() for managed-node1/service
12755 1727204123.17437: done queuing things up, now waiting for results queue to drain
12755 1727204123.17439: waiting for pending results...
12755 1727204123.18160: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
12755 1727204123.18351: in run() - task 12b410aa-8751-72e9-1a19-0000000000dc
12755 1727204123.18376: variable 'ansible_search_path' from source: unknown
12755 1727204123.18380: variable 'ansible_search_path' from source: unknown
12755 1727204123.18424: calling self._execute()
12755 1727204123.18556: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204123.18565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204123.18587: variable 'omit' from source: magic vars
12755 1727204123.19069: variable 'ansible_distribution_major_version' from source: facts
12755 1727204123.19195: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204123.19255: variable 'network_provider' from source: set_fact
12755 1727204123.19262: Evaluated conditional (network_provider == "nm"): True
12755 1727204123.19391: variable '__network_wpa_supplicant_required' from source: role '' defaults
12755 1727204123.19509: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
12755 1727204123.19760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204123.23814: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204123.23904: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204123.23954: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204123.23999: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204123.24029: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204123.24136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204123.24174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204123.24205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204123.24251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204123.24277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204123.24495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204123.24501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204123.24505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204123.24532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204123.24550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204123.24607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204123.24636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204123.24666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204123.24723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204123.24740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204123.24986: variable 'network_connections' from source: task vars
12755 1727204123.25005: variable 'controller_profile' from source: play vars
12755 1727204123.25133: variable 'controller_profile' from source: play vars
12755 1727204123.25157: variable 'controller_device' from source: play vars
12755 1727204123.25232: variable 'controller_device' from source: play vars
12755 1727204123.25245:
variable 'port1_profile' from source: play vars
12755 1727204123.25324: variable 'port1_profile' from source: play vars
12755 1727204123.25374: variable 'dhcp_interface1' from source: play vars
12755 1727204123.25414: variable 'dhcp_interface1' from source: play vars
12755 1727204123.25418: variable 'controller_profile' from source: play vars
12755 1727204123.25699: variable 'controller_profile' from source: play vars
12755 1727204123.25702: variable 'port2_profile' from source: play vars
12755 1727204123.25705: variable 'port2_profile' from source: play vars
12755 1727204123.25707: variable 'dhcp_interface2' from source: play vars
12755 1727204123.25713: variable 'dhcp_interface2' from source: play vars
12755 1727204123.25715: variable 'controller_profile' from source: play vars
12755 1727204123.25742: variable 'controller_profile' from source: play vars
12755 1727204123.25845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204123.26071: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204123.26118: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204123.26163: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204123.26199: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204123.26259: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204123.26289: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204123.26318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204123.26359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204123.26424: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204123.27055: variable 'network_connections' from source: task vars
12755 1727204123.27060: variable 'controller_profile' from source: play vars
12755 1727204123.27078: variable 'controller_profile' from source: play vars
12755 1727204123.27087: variable 'controller_device' from source: play vars
12755 1727204123.27163: variable 'controller_device' from source: play vars
12755 1727204123.27173: variable 'port1_profile' from source: play vars
12755 1727204123.27254: variable 'port1_profile' from source: play vars
12755 1727204123.27268: variable 'dhcp_interface1' from source: play vars
12755 1727204123.27348: variable 'dhcp_interface1' from source: play vars
12755 1727204123.27352: variable 'controller_profile' from source: play vars
12755 1727204123.27446: variable 'controller_profile' from source: play vars
12755 1727204123.27450: variable 'port2_profile' from source: play vars
12755 1727204123.27527: variable 'port2_profile' from source: play vars
12755 1727204123.27712: variable 'dhcp_interface2' from source: play vars
12755 1727204123.27778: variable 'dhcp_interface2' from source: play vars
12755 1727204123.27785: variable 'controller_profile' from source: play vars
12755 1727204123.28047: variable 'controller_profile' from source: play vars
12755 1727204123.28244: Evaluated conditional (__network_wpa_supplicant_required): False
12755 1727204123.28248: when evaluation is False, skipping this task
12755 1727204123.28251: _execute() done
12755 1727204123.28256: dumping result to json
12755 1727204123.28263: done dumping result, returning
12755 1727204123.28294: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-72e9-1a19-0000000000dc]
12755 1727204123.28298: sending task result for task 12b410aa-8751-72e9-1a19-0000000000dc
12755 1727204123.28399: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000dc
12755 1727204123.28402: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
12755 1727204123.28469: no more pending results, returning what we have
12755 1727204123.28474: results queue empty
12755 1727204123.28475: checking for any_errors_fatal
12755 1727204123.28505: done checking for any_errors_fatal
12755 1727204123.28507: checking for max_fail_percentage
12755 1727204123.28509: done checking for max_fail_percentage
12755 1727204123.28510: checking to see if all hosts have failed and the running result is not ok
12755 1727204123.28511: done checking to see if all hosts have failed
12755 1727204123.28512: getting the remaining hosts for this loop
12755 1727204123.28514: done getting the remaining hosts for this loop
12755 1727204123.28520: getting the next task for host managed-node1
12755 1727204123.28528: done getting next task for host managed-node1
12755 1727204123.28534: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
12755 1727204123.28537: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state?
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204123.28564: getting variables
12755 1727204123.28567: in VariableManager get_vars()
12755 1727204123.28845: Calling all_inventory to load vars for managed-node1
12755 1727204123.28849: Calling groups_inventory to load vars for managed-node1
12755 1727204123.28853: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204123.28867: Calling all_plugins_play to load vars for managed-node1
12755 1727204123.28871: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204123.28876: Calling groups_plugins_play to load vars for managed-node1
12755 1727204123.31802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204123.35684: done with get_vars()
12755 1727204123.35739: done getting variables
12755 1727204123.35818: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.192) 0:00:48.594 *****
12755 1727204123.35857: entering _queue_task() for managed-node1/service
12755 1727204123.36236: worker is 1 (out of 1 available)
12755 1727204123.36253: exiting _queue_task() for managed-node1/service
12755 1727204123.36267: done queuing things up, now waiting for results queue to drain
12755 1727204123.36269: waiting for pending results...
12755 1727204123.36722: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service
12755 1727204123.36731: in run() - task 12b410aa-8751-72e9-1a19-0000000000dd
12755 1727204123.36735: variable 'ansible_search_path' from source: unknown
12755 1727204123.36738: variable 'ansible_search_path' from source: unknown
12755 1727204123.36774: calling self._execute()
12755 1727204123.36902: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204123.36924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204123.36942: variable 'omit' from source: magic vars
12755 1727204123.37415: variable 'ansible_distribution_major_version' from source: facts
12755 1727204123.37435: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204123.37599: variable 'network_provider' from source: set_fact
12755 1727204123.37614: Evaluated conditional (network_provider == "initscripts"): False
12755 1727204123.37623: when evaluation is False, skipping this task
12755 1727204123.37630: _execute() done
12755 1727204123.37639: dumping result to json
12755 1727204123.37646: done dumping result, returning
12755 1727204123.37658: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-72e9-1a19-0000000000dd]
12755 1727204123.37690: sending task result for task 12b410aa-8751-72e9-1a19-0000000000dd
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
12755 1727204123.37838: no more pending results, returning what we have
12755 1727204123.37843: results queue empty
12755 1727204123.37844: checking for any_errors_fatal
12755 1727204123.37854: done checking for any_errors_fatal
12755 1727204123.37855: checking for max_fail_percentage
12755 1727204123.37857: done checking for max_fail_percentage
12755 1727204123.37858: checking to see if all hosts have failed and the running result is not ok
12755 1727204123.37859: done checking to see if all hosts have failed
12755 1727204123.37860: getting the remaining hosts for this loop
12755 1727204123.37861: done getting the remaining hosts for this loop
12755 1727204123.37866: getting the next task for host managed-node1
12755 1727204123.37875: done getting next task for host managed-node1
12755 1727204123.37880: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
12755 1727204123.37884: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
12755 1727204123.37904: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000dd
12755 1727204123.37908: WORKER PROCESS EXITING
12755 1727204123.38013: getting variables
12755 1727204123.38016: in VariableManager get_vars()
12755 1727204123.38073: Calling all_inventory to load vars for managed-node1
12755 1727204123.38076: Calling groups_inventory to load vars for managed-node1
12755 1727204123.38079: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204123.38094: Calling all_plugins_play to load vars for managed-node1
12755 1727204123.38097: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204123.38101: Calling groups_plugins_play to load vars for managed-node1
12755 1727204123.40204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204123.43118: done with get_vars()
12755 1727204123.43171: done getting variables
12755 1727204123.43253: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.074) 0:00:48.668 *****
12755 1727204123.43302: entering _queue_task() for managed-node1/copy
12755 1727204123.43687: worker is 1 (out of 1 available)
12755 1727204123.43806: exiting _queue_task() for managed-node1/copy
12755 1727204123.43819: done queuing things up, now waiting for results queue to drain
12755 1727204123.43821: waiting for pending results...
12755 1727204123.44052: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
12755 1727204123.44278: in run() - task 12b410aa-8751-72e9-1a19-0000000000de
12755 1727204123.44282: variable 'ansible_search_path' from source: unknown
12755 1727204123.44285: variable 'ansible_search_path' from source: unknown
12755 1727204123.44306: calling self._execute()
12755 1727204123.44433: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204123.44449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204123.44597: variable 'omit' from source: magic vars
12755 1727204123.44931: variable 'ansible_distribution_major_version' from source: facts
12755 1727204123.44952: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204123.45110: variable 'network_provider' from source: set_fact
12755 1727204123.45124: Evaluated conditional (network_provider == "initscripts"): False
12755 1727204123.45134: when evaluation is False, skipping this task
12755 1727204123.45145: _execute() done
12755 1727204123.45157: dumping result to json
12755 1727204123.45166: done dumping result, returning
12755 1727204123.45180: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-72e9-1a19-0000000000de]
12755 1727204123.45195: sending task result for task 12b410aa-8751-72e9-1a19-0000000000de
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
12755 1727204123.45369: no more pending results, returning what we have
12755 1727204123.45374: results queue empty
12755 1727204123.45375: checking for any_errors_fatal
12755 1727204123.45383: done checking for any_errors_fatal
12755 1727204123.45384: checking for max_fail_percentage
12755 1727204123.45387: done checking for max_fail_percentage
12755 1727204123.45388: checking to see if all hosts have failed and the running result is not ok
12755 1727204123.45391: done checking to see if all hosts have failed
12755 1727204123.45392: getting the remaining hosts for this loop
12755 1727204123.45394: done getting the remaining hosts for this loop
12755 1727204123.45498: getting the next task for host managed-node1
12755 1727204123.45508: done getting next task for host managed-node1
12755 1727204123.45514: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
12755 1727204123.45519: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204123.45551: getting variables
12755 1727204123.45554: in VariableManager get_vars()
12755 1727204123.45819: Calling all_inventory to load vars for managed-node1
12755 1727204123.45823: Calling groups_inventory to load vars for managed-node1
12755 1727204123.45826: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204123.45838: Calling all_plugins_play to load vars for managed-node1
12755 1727204123.45841: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204123.45845: Calling groups_plugins_play to load vars for managed-node1
12755 1727204123.46538: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000de
12755 1727204123.46542: WORKER PROCESS EXITING
12755 1727204123.53341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204123.57440: done with get_vars()
12755 1727204123.57491: done getting variables
TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024 14:55:23 -0400 (0:00:00.142) 0:00:48.811 *****
12755 1727204123.57581: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
12755 1727204123.57947: worker is 1 (out of 1 available)
12755 1727204123.57962: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
12755 1727204123.57975: done queuing things up, now waiting for results queue to drain
12755 1727204123.57978: waiting for pending results...
12755 1727204123.58305: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
12755 1727204123.58495: in run() - task 12b410aa-8751-72e9-1a19-0000000000df
12755 1727204123.58519: variable 'ansible_search_path' from source: unknown
12755 1727204123.58533: variable 'ansible_search_path' from source: unknown
12755 1727204123.58582: calling self._execute()
12755 1727204123.58711: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204123.58727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204123.58750: variable 'omit' from source: magic vars
12755 1727204123.59345: variable 'ansible_distribution_major_version' from source: facts
12755 1727204123.59387: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204123.59423: variable 'omit' from source: magic vars
12755 1727204123.59667: variable 'omit' from source: magic vars
12755 1727204123.59985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204123.62830: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204123.62924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204123.62977: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204123.63027: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204123.63067: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204123.63170: variable 'network_provider' from source: set_fact
12755 1727204123.63340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204123.63383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204123.63422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204123.63484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204123.63508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204123.63606: variable 'omit' from source: magic vars
12755 1727204123.63754: variable 'omit' from source: magic vars
12755 1727204123.63894: variable 'network_connections' from source: task vars
12755 1727204123.63920: variable 'controller_profile' from source: play vars
12755 1727204123.64000: variable 'controller_profile' from source: play vars
12755 1727204123.64024: variable 'controller_device' from source: play vars
12755 1727204123.64134: variable 'controller_device' from source: play vars
12755 1727204123.64137: variable 'port1_profile' from source: play vars
12755 1727204123.64188: variable 'port1_profile' from source: play vars
12755 1727204123.64204: variable 'dhcp_interface1' from source: play vars
12755 1727204123.64283: variable 'dhcp_interface1' from source: play vars
12755 1727204123.64298: variable 'controller_profile' from source: play vars
12755 1727204123.64380: variable 'controller_profile' from source: play vars
12755 1727204123.64460: variable 'port2_profile' from source: play vars
12755 1727204123.64478: variable 'port2_profile' from source: play vars
12755 1727204123.64493: variable 'dhcp_interface2' from source: play vars
12755 1727204123.64570: variable 'dhcp_interface2' from source: play vars
12755 1727204123.64583: variable 'controller_profile' from source: play vars
12755 1727204123.64659: variable 'controller_profile' from source: play vars
12755 1727204123.64912: variable 'omit' from source: magic vars
12755 1727204123.64927: variable '__lsr_ansible_managed' from source: task vars
12755 1727204123.65007: variable '__lsr_ansible_managed' from source: task vars
12755 1727204123.65243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
12755 1727204123.65539: Loaded config def from plugin (lookup/template)
12755 1727204123.65659: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
12755 1727204123.65662: File lookup term: get_ansible_managed.j2
12755 1727204123.65665: variable 'ansible_search_path' from source: unknown
12755 1727204123.65668: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
12755 1727204123.65672: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
12755 1727204123.65675: variable 'ansible_search_path' from source: unknown
12755 1727204123.79415: variable 'ansible_managed' from source: unknown
12755 1727204123.79852: variable 'omit' from source: magic vars
12755 1727204123.79895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204123.79988: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204123.79994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204123.79996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204123.80010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204123.80044: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12755 1727204123.80048: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204123.80052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204123.80193: Set connection var ansible_connection to ssh
12755 1727204123.80206: Set connection var ansible_module_compression to ZIP_DEFLATED
12755 1727204123.80212: Set connection var ansible_shell_type to sh
12755 1727204123.80294: Set connection var ansible_timeout to 10
12755 1727204123.80298: Set connection var ansible_shell_executable to /bin/sh
12755 1727204123.80313: Set connection var ansible_pipelining to False
12755 1727204123.80317: variable 'ansible_shell_executable' from source: unknown
12755 1727204123.80321: variable 'ansible_connection' from source: unknown
12755 1727204123.80326: variable 'ansible_module_compression' from source: unknown
12755 1727204123.80329: variable 'ansible_shell_type' from source: unknown
12755 1727204123.80331: variable 'ansible_shell_executable' from source: unknown
12755 1727204123.80333: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204123.80335: variable 'ansible_pipelining' from source: unknown
12755 1727204123.80337: variable 'ansible_timeout' from source: unknown
12755 1727204123.80340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204123.80481: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
12755 1727204123.80495: variable 'omit' from source: magic vars
12755 1727204123.80504: starting attempt loop
12755 1727204123.80507: running the handler
12755 1727204123.80544: _low_level_execute_command(): starting
12755 1727204123.80548: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12755 1727204123.81591: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
12755 1727204123.81635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<<
12755 1727204123.81644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204123.81658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204123.81735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204123.83665: stdout chunk (state=3): >>>/root <<<
12755 1727204123.83775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204123.83893: stderr chunk (state=3): >>><<<
12755 1727204123.83896: stdout chunk (state=3): >>><<<
12755 1727204123.83918: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.11.210 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204123.83938: _low_level_execute_command(): starting 12755 1727204123.83952: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115 `" && echo ansible-tmp-1727204123.8392618-15441-195826925464115="` echo /root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115 `" ) && sleep 0' 12755 1727204123.84592: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204123.84609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204123.84625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204123.84650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204123.84666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204123.84678: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204123.84708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204123.84758: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 
1727204123.84824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204123.84840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204123.84868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204123.84946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204123.87220: stdout chunk (state=3): >>>ansible-tmp-1727204123.8392618-15441-195826925464115=/root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115 <<< 12755 1727204123.87687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204123.87694: stdout chunk (state=3): >>><<< 12755 1727204123.87697: stderr chunk (state=3): >>><<< 12755 1727204123.87700: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204123.8392618-15441-195826925464115=/root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204123.87707: variable 'ansible_module_compression' from source: unknown 12755 1727204123.87709: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12755 1727204123.87711: variable 'ansible_facts' from source: unknown 12755 1727204123.87867: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/AnsiballZ_network_connections.py 12755 1727204123.88072: Sending initial data 12755 1727204123.88084: Sent initial data (168 bytes) 12755 1727204123.88692: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204123.88714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204123.88807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204123.88845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204123.88862: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204123.88886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204123.88973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204123.90797: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 12755 1727204123.90807: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204123.91094: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/AnsiballZ_network_connections.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpy_ws52ls" to remote "/root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/AnsiballZ_network_connections.py" <<< 12755 1727204123.91098: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpy_ws52ls /root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/AnsiballZ_network_connections.py <<< 12755 1727204123.92767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204123.92865: stderr chunk (state=3): >>><<< 12755 1727204123.92877: stdout chunk (state=3): >>><<< 12755 1727204123.92911: done transferring module to remote 12755 1727204123.92954: _low_level_execute_command(): starting 12755 1727204123.92961: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/ /root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/AnsiballZ_network_connections.py && sleep 0' 12755 1727204123.93548: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204123.93619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204123.93627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204123.93631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204123.93633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204123.93635: stderr chunk (state=3): 
>>>debug2: match not found <<< 12755 1727204123.93641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204123.93715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204123.93728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204123.93753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204123.93864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204123.95897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204123.95956: stderr chunk (state=3): >>><<< 12755 1727204123.95966: stdout chunk (state=3): >>><<< 12755 1727204123.95992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204123.96002: _low_level_execute_command(): starting 12755 1727204123.96015: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/AnsiballZ_network_connections.py && sleep 0' 12755 1727204123.96692: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204123.96708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204123.96726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204123.96758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204123.96880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master <<< 12755 1727204123.96894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204123.96914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204123.97008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204124.55073: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": 
"up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12755 1727204124.57501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204124.57505: stdout chunk (state=3): >>><<< 12755 1727204124.57508: stderr chunk (state=3): >>><<< 12755 1727204124.57518: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, 
"__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204124.57627: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204124.57696: _low_level_execute_command(): starting 12755 1727204124.57700: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204123.8392618-15441-195826925464115/ > /dev/null 2>&1 && sleep 0' 12755 1727204124.58372: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204124.58386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204124.58441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204124.58457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204124.58504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204124.60697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204124.60701: stdout chunk (state=3): >>><<< 12755 1727204124.60703: stderr chunk (state=3): >>><<< 12755 1727204124.60706: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204124.60711: handler run complete 12755 1727204124.60713: attempt loop complete, returning result 12755 1727204124.60715: _execute() done 12755 1727204124.60717: dumping result to json 12755 1727204124.60719: done dumping result, returning 12755 1727204124.60721: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-72e9-1a19-0000000000df] 12755 1727204124.60723: sending task result for task 12b410aa-8751-72e9-1a19-0000000000df 12755 1727204124.61195: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000df 12755 1727204124.61198: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44 [008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c [009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, 
c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active) 12755 1727204124.61372: no more pending results, returning what we have 12755 1727204124.61376: results queue empty 12755 1727204124.61377: checking for any_errors_fatal 12755 1727204124.61383: done checking for any_errors_fatal 12755 1727204124.61384: checking for max_fail_percentage 12755 1727204124.61385: done checking for max_fail_percentage 12755 1727204124.61386: checking to see if all hosts have failed and the running result is not ok 12755 1727204124.61387: done checking to see if all hosts have failed 12755 1727204124.61388: getting the remaining hosts for this loop 12755 1727204124.61490: done getting the remaining hosts for this loop 12755 1727204124.61497: getting the next task for host managed-node1 12755 1727204124.61504: done getting next task for host managed-node1 12755 1727204124.61508: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12755 1727204124.61511: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204124.61527: getting variables 12755 1727204124.61529: in VariableManager get_vars() 12755 1727204124.61592: Calling all_inventory to load vars for managed-node1 12755 1727204124.61595: Calling groups_inventory to load vars for managed-node1 12755 1727204124.61598: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204124.61609: Calling all_plugins_play to load vars for managed-node1 12755 1727204124.61613: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204124.61617: Calling groups_plugins_play to load vars for managed-node1 12755 1727204124.64004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204124.68199: done with get_vars() 12755 1727204124.68249: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:24 -0400 (0:00:01.107) 0:00:49.919 ***** 12755 1727204124.68377: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204124.69075: worker is 1 (out of 1 available) 12755 1727204124.69097: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204124.69111: done queuing things up, now waiting for results queue to drain 12755 1727204124.69113: waiting for pending results... 
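The task queued here is subsequently skipped because the role's `network_state` default is an empty dict. As a hedged sketch (the actual task body lives in the collection at `roles/network/tasks/main.yml:171` and is not shown in this log), the gating looks roughly like:

```yaml
# Hedged sketch of the gating seen in this log: the distribution check
# evaluates True, network_state != {} evaluates False, so the task skips.
# The state argument below is an assumption, not copied from this log.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```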
12755 1727204124.69717: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state
12755 1727204124.70165: in run() - task 12b410aa-8751-72e9-1a19-0000000000e0
12755 1727204124.70169: variable 'ansible_search_path' from source: unknown
12755 1727204124.70173: variable 'ansible_search_path' from source: unknown
12755 1727204124.70216: calling self._execute()
12755 1727204124.70512: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204124.70522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204124.70534: variable 'omit' from source: magic vars
12755 1727204124.71536: variable 'ansible_distribution_major_version' from source: facts
12755 1727204124.71609: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204124.71996: variable 'network_state' from source: role '' defaults
12755 1727204124.72012: Evaluated conditional (network_state != {}): False
12755 1727204124.72018: when evaluation is False, skipping this task
12755 1727204124.72022: _execute() done
12755 1727204124.72027: dumping result to json
12755 1727204124.72031: done dumping result, returning
12755 1727204124.72042: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-72e9-1a19-0000000000e0]
12755 1727204124.72048: sending task result for task 12b410aa-8751-72e9-1a19-0000000000e0
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12755 1727204124.72232: no more pending results, returning what we have
12755 1727204124.72237: results queue empty
12755 1727204124.72238: checking for any_errors_fatal
12755 1727204124.72251: done checking for any_errors_fatal
12755 1727204124.72252: checking for max_fail_percentage
12755 1727204124.72254: done checking for max_fail_percentage
12755 1727204124.72255:
checking to see if all hosts have failed and the running result is not ok 12755 1727204124.72257: done checking to see if all hosts have failed 12755 1727204124.72257: getting the remaining hosts for this loop 12755 1727204124.72259: done getting the remaining hosts for this loop 12755 1727204124.72265: getting the next task for host managed-node1 12755 1727204124.72274: done getting next task for host managed-node1 12755 1727204124.72278: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204124.72282: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204124.72412: getting variables 12755 1727204124.72414: in VariableManager get_vars() 12755 1727204124.72475: Calling all_inventory to load vars for managed-node1 12755 1727204124.72478: Calling groups_inventory to load vars for managed-node1 12755 1727204124.72481: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204124.72697: Calling all_plugins_play to load vars for managed-node1 12755 1727204124.72701: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204124.72706: Calling groups_plugins_play to load vars for managed-node1 12755 1727204124.73496: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000e0 12755 1727204124.73499: WORKER PROCESS EXITING 12755 1727204124.76908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204124.82236: done with get_vars() 12755 1727204124.82320: done getting variables 12755 1727204124.82420: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.140) 0:00:50.060 ***** 12755 1727204124.82467: entering _queue_task() for managed-node1/debug 12755 1727204124.82857: worker is 1 (out of 1 available) 12755 1727204124.82872: exiting _queue_task() for managed-node1/debug 12755 1727204124.82886: done queuing things up, now waiting for results queue to drain 12755 1727204124.82887: waiting for pending results... 
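Judging from the task name and the variable it ends up printing, the task at roles/network/tasks/main.yml:177 that is being queued here is presumably a plain `debug` of the registered module result, roughly:

```yaml
# Sketch reconstructed from the log output; not copied from the role source.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```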
12755 1727204124.83394: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204124.83794: in run() - task 12b410aa-8751-72e9-1a19-0000000000e1 12755 1727204124.83838: variable 'ansible_search_path' from source: unknown 12755 1727204124.83847: variable 'ansible_search_path' from source: unknown 12755 1727204124.83893: calling self._execute() 12755 1727204124.84016: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204124.84031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204124.84047: variable 'omit' from source: magic vars 12755 1727204124.84477: variable 'ansible_distribution_major_version' from source: facts 12755 1727204124.84500: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204124.84511: variable 'omit' from source: magic vars 12755 1727204124.84586: variable 'omit' from source: magic vars 12755 1727204124.84636: variable 'omit' from source: magic vars 12755 1727204124.84685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204124.84732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204124.84760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204124.84785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204124.84808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204124.84848: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204124.84857: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204124.84865: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12755 1727204124.84993: Set connection var ansible_connection to ssh 12755 1727204124.85007: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204124.85015: Set connection var ansible_shell_type to sh 12755 1727204124.85034: Set connection var ansible_timeout to 10 12755 1727204124.85045: Set connection var ansible_shell_executable to /bin/sh 12755 1727204124.85055: Set connection var ansible_pipelining to False 12755 1727204124.85085: variable 'ansible_shell_executable' from source: unknown 12755 1727204124.85096: variable 'ansible_connection' from source: unknown 12755 1727204124.85105: variable 'ansible_module_compression' from source: unknown 12755 1727204124.85113: variable 'ansible_shell_type' from source: unknown 12755 1727204124.85120: variable 'ansible_shell_executable' from source: unknown 12755 1727204124.85127: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204124.85135: variable 'ansible_pipelining' from source: unknown 12755 1727204124.85143: variable 'ansible_timeout' from source: unknown 12755 1727204124.85152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204124.85317: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204124.85396: variable 'omit' from source: magic vars 12755 1727204124.85399: starting attempt loop 12755 1727204124.85402: running the handler 12755 1727204124.85504: variable '__network_connections_result' from source: set_fact 12755 1727204124.85588: handler run complete 12755 1727204124.85619: attempt loop complete, returning result 12755 1727204124.85628: _execute() done 12755 1727204124.85636: dumping result to json 12755 1727204124.85644: 
done dumping result, returning
12755 1727204124.85659: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-72e9-1a19-0000000000e1]
12755 1727204124.85668: sending task result for task 12b410aa-8751-72e9-1a19-0000000000e1
ok: [managed-node1] => {
    "__network_connections_result.stderr_lines": [
        "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44",
        "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c",
        "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38",
        "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44 (is-modified)",
        "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)",
        "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)"
    ]
}
12755 1727204124.85858: no more pending results, returning what we have
12755 1727204124.85861: results queue empty
12755 1727204124.85863: checking for any_errors_fatal
12755 1727204124.85876: done checking for any_errors_fatal
12755 1727204124.85877: checking for max_fail_percentage
12755 1727204124.85879: done checking for max_fail_percentage
12755 1727204124.85880: checking to see if all hosts have failed and the running result is not ok
12755 1727204124.85881: done checking to see if all hosts have failed
12755 1727204124.85882: getting the remaining hosts for this loop
12755 1727204124.85884: done getting the remaining hosts for this loop
12755 1727204124.85888: getting the next task for host managed-node1
12755 1727204124.85898: done getting next task for host managed-node1
12755 1727204124.85903: ^ task is: TASK:
fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204124.85906: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204124.85925: getting variables 12755 1727204124.85927: in VariableManager get_vars() 12755 1727204124.86240: Calling all_inventory to load vars for managed-node1 12755 1727204124.86243: Calling groups_inventory to load vars for managed-node1 12755 1727204124.86246: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204124.86256: Calling all_plugins_play to load vars for managed-node1 12755 1727204124.86259: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204124.86262: Calling groups_plugins_play to load vars for managed-node1 12755 1727204124.87503: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000e1 12755 1727204124.87508: WORKER PROCESS EXITING 12755 1727204124.89723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204124.93058: done with get_vars() 12755 1727204124.93108: done getting variables 12755 1727204124.93193: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:24 -0400 (0:00:00.107) 0:00:50.168 ***** 12755 1727204124.93242: entering _queue_task() for managed-node1/debug 12755 1727204124.93671: worker is 1 (out of 1 available) 12755 1727204124.93693: exiting _queue_task() for managed-node1/debug 12755 1727204124.93706: done queuing things up, now waiting for results queue to drain 12755 1727204124.93708: waiting for pending results... 12755 1727204124.94051: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204124.94204: in run() - task 12b410aa-8751-72e9-1a19-0000000000e2 12755 1727204124.94222: variable 'ansible_search_path' from source: unknown 12755 1727204124.94235: variable 'ansible_search_path' from source: unknown 12755 1727204124.94495: calling self._execute() 12755 1727204124.94500: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204124.94503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204124.94506: variable 'omit' from source: magic vars 12755 1727204124.94910: variable 'ansible_distribution_major_version' from source: facts 12755 1727204124.94926: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204124.94935: variable 'omit' from source: magic vars 12755 1727204124.95011: variable 'omit' from source: magic vars 12755 1727204124.95059: variable 'omit' from source: magic vars 12755 1727204124.95114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204124.95156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204124.95181: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204124.95204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204124.95230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204124.95264: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204124.95268: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204124.95273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204124.95412: Set connection var ansible_connection to ssh 12755 1727204124.95422: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204124.95426: Set connection var ansible_shell_type to sh 12755 1727204124.95448: Set connection var ansible_timeout to 10 12755 1727204124.95456: Set connection var ansible_shell_executable to /bin/sh 12755 1727204124.95463: Set connection var ansible_pipelining to False 12755 1727204124.95496: variable 'ansible_shell_executable' from source: unknown 12755 1727204124.95499: variable 'ansible_connection' from source: unknown 12755 1727204124.95505: variable 'ansible_module_compression' from source: unknown 12755 1727204124.95507: variable 'ansible_shell_type' from source: unknown 12755 1727204124.95515: variable 'ansible_shell_executable' from source: unknown 12755 1727204124.95517: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204124.95524: variable 'ansible_pipelining' from source: unknown 12755 1727204124.95527: variable 'ansible_timeout' from source: unknown 12755 1727204124.95533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204124.95751: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12755 1727204124.95772: variable 'omit' from source: magic vars
12755 1727204124.95778: starting attempt loop
12755 1727204124.95781: running the handler
12755 1727204124.96070: variable '__network_connections_result' from source: set_fact
12755 1727204124.96074: variable '__network_connections_result' from source: set_fact
12755 1727204124.96396: handler run complete
12755 1727204124.96399: attempt loop complete, returning result
12755 1727204124.96401: _execute() done
12755 1727204124.96403: dumping result to json
12755 1727204124.96404: done dumping result, returning
12755 1727204124.96406: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-72e9-1a19-0000000000e2]
12755 1727204124.96408: sending task result for task 12b410aa-8751-72e9-1a19-0000000000e2
12755 1727204124.96482: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000e2
12755 1727204124.96486: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44",
            "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c",
            "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44 (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b31f9f7b-aae6-41e7-b0f9-8f3978732a4c (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c3d9ed9d-1394-45f3-85cd-4954ed7e4e38 (not-active)"
        ]
    }
}
12755 1727204124.96634: no more pending results, returning what we have
12755 1727204124.96638: results queue empty
12755 1727204124.96645: checking for any_errors_fatal
12755 1727204124.96652: done checking for any_errors_fatal
12755 1727204124.96653: checking for max_fail_percentage
12755 1727204124.96655: done checking for max_fail_percentage
12755 1727204124.96657: checking to see if all hosts have failed and the running result is not ok
12755 1727204124.96658: done checking to see if all hosts have failed
12755 1727204124.96659: getting the
remaining hosts for this loop 12755 1727204124.96660: done getting the remaining hosts for this loop 12755 1727204124.96666: getting the next task for host managed-node1 12755 1727204124.96674: done getting next task for host managed-node1 12755 1727204124.96679: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12755 1727204124.96683: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204124.96906: getting variables 12755 1727204124.96911: in VariableManager get_vars() 12755 1727204124.96963: Calling all_inventory to load vars for managed-node1 12755 1727204124.96966: Calling groups_inventory to load vars for managed-node1 12755 1727204124.96969: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204124.96979: Calling all_plugins_play to load vars for managed-node1 12755 1727204124.96983: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204124.96987: Calling groups_plugins_play to load vars for managed-node1 12755 1727204124.99492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204125.02619: done with get_vars() 12755 1727204125.02664: done getting variables 12755 1727204125.02738: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.095) 0:00:50.263 ***** 12755 1727204125.02783: entering _queue_task() for managed-node1/debug 12755 1727204125.03158: worker is 1 (out of 1 available) 12755 1727204125.03172: exiting _queue_task() for managed-node1/debug 12755 1727204125.03187: done queuing things up, now waiting for results queue to drain 12755 1727204125.03188: waiting for pending results... 12755 1727204125.03529: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12755 1727204125.03698: in run() - task 12b410aa-8751-72e9-1a19-0000000000e3 12755 1727204125.03717: variable 'ansible_search_path' from source: unknown 12755 1727204125.03721: variable 'ansible_search_path' from source: unknown 12755 1727204125.03768: calling self._execute() 12755 1727204125.03893: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204125.03901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204125.03916: variable 'omit' from source: magic vars 12755 1727204125.04390: variable 'ansible_distribution_major_version' from source: facts 12755 1727204125.04405: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204125.04568: variable 'network_state' from source: role '' defaults 12755 1727204125.04580: Evaluated conditional (network_state != {}): False 12755 1727204125.04584: when evaluation is False, skipping this task 12755 1727204125.04587: _execute() done 12755 1727204125.04593: dumping result to json 12755 1727204125.04602: done 
dumping result, returning
12755 1727204125.04621: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-72e9-1a19-0000000000e3]
12755 1727204125.04627: sending task result for task 12b410aa-8751-72e9-1a19-0000000000e3
12755 1727204125.04732: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000e3
12755 1727204125.04736: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
12755 1727204125.04791: no more pending results, returning what we have
12755 1727204125.04795: results queue empty
12755 1727204125.04796: checking for any_errors_fatal
12755 1727204125.04813: done checking for any_errors_fatal
12755 1727204125.04814: checking for max_fail_percentage
12755 1727204125.04817: done checking for max_fail_percentage
12755 1727204125.04818: checking to see if all hosts have failed and the running result is not ok
12755 1727204125.04820: done checking to see if all hosts have failed
12755 1727204125.04821: getting the remaining hosts for this loop
12755 1727204125.04822: done getting the remaining hosts for this loop
12755 1727204125.04828: getting the next task for host managed-node1
12755 1727204125.04836: done getting next task for host managed-node1
12755 1727204125.04841: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
12755 1727204125.04845: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False 12755 1727204125.04874: getting variables 12755 1727204125.04876: in VariableManager get_vars() 12755 1727204125.05052: Calling all_inventory to load vars for managed-node1 12755 1727204125.05056: Calling groups_inventory to load vars for managed-node1 12755 1727204125.05059: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204125.05073: Calling all_plugins_play to load vars for managed-node1 12755 1727204125.05077: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204125.05124: Calling groups_plugins_play to load vars for managed-node1 12755 1727204125.07579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204125.10632: done with get_vars() 12755 1727204125.10681: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.080) 0:00:50.343 ***** 12755 1727204125.10810: entering _queue_task() for managed-node1/ping 12755 1727204125.11196: worker is 1 (out of 1 available) 12755 1727204125.11218: exiting _queue_task() for managed-node1/ping 12755 1727204125.11233: done queuing things up, now waiting for results queue to drain 12755 1727204125.11234: waiting for pending results... 
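Before the ping module runs, Ansible's ssh connection plugin creates a private temp directory on the managed node with the `( umask 77 && mkdir -p ... && mkdir ... && echo ... )` one-liner that appears verbatim later in this trace. A local sketch of that pattern (the directory names here are illustrative stand-ins, not Ansible's real `~/.ansible/tmp` paths):

```shell
# Mimic Ansible's remote temp-dir creation:
#  - umask 77 makes every directory we create private (mode 0700)
#  - mkdir -p ensures the parent exists without failing if it already does
#  - the bare mkdir fails if the unique dir already exists (collision guard)
#  - echo reports the resulting path back to the controller
tmproot="${TMPDIR:-/tmp}/ansible-tmp-demo"
tmpname="ansible-tmp-$(date +%s)-$$"
tmpdir=$( umask 77 && mkdir -p "$tmproot" && mkdir "$tmproot/$tmpname" \
    && echo "$tmproot/$tmpname" )
echo "$tmpdir"
```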
12755 1727204125.11615: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12755 1727204125.11693: in run() - task 12b410aa-8751-72e9-1a19-0000000000e4 12755 1727204125.11709: variable 'ansible_search_path' from source: unknown 12755 1727204125.11713: variable 'ansible_search_path' from source: unknown 12755 1727204125.11762: calling self._execute() 12755 1727204125.12038: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204125.12047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204125.12290: variable 'omit' from source: magic vars 12755 1727204125.13351: variable 'ansible_distribution_major_version' from source: facts 12755 1727204125.13362: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204125.13370: variable 'omit' from source: magic vars 12755 1727204125.13569: variable 'omit' from source: magic vars 12755 1727204125.13665: variable 'omit' from source: magic vars 12755 1727204125.13820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204125.13864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204125.13894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204125.14012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204125.14107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204125.14150: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204125.14154: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204125.14159: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12755 1727204125.14291: Set connection var ansible_connection to ssh 12755 1727204125.14504: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204125.14695: Set connection var ansible_shell_type to sh 12755 1727204125.14699: Set connection var ansible_timeout to 10 12755 1727204125.14702: Set connection var ansible_shell_executable to /bin/sh 12755 1727204125.14704: Set connection var ansible_pipelining to False 12755 1727204125.14706: variable 'ansible_shell_executable' from source: unknown 12755 1727204125.14709: variable 'ansible_connection' from source: unknown 12755 1727204125.14712: variable 'ansible_module_compression' from source: unknown 12755 1727204125.14714: variable 'ansible_shell_type' from source: unknown 12755 1727204125.14716: variable 'ansible_shell_executable' from source: unknown 12755 1727204125.14718: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204125.14721: variable 'ansible_pipelining' from source: unknown 12755 1727204125.14723: variable 'ansible_timeout' from source: unknown 12755 1727204125.14725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204125.15171: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204125.15185: variable 'omit' from source: magic vars 12755 1727204125.15195: starting attempt loop 12755 1727204125.15390: running the handler 12755 1727204125.15407: _low_level_execute_command(): starting 12755 1727204125.15419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204125.16809: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 
1727204125.16812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204125.16986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204125.16997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204125.17067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204125.17074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204125.17305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204125.19218: stdout chunk (state=3): >>>/root <<< 12755 1727204125.19491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204125.19502: stderr chunk (state=3): >>><<< 12755 1727204125.19508: stdout chunk (state=3): >>><<< 12755 1727204125.19543: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204125.19557: _low_level_execute_command(): starting 12755 1727204125.19565: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750 `" && echo ansible-tmp-1727204125.1954198-15498-136043380167750="` echo /root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750 `" ) && sleep 0' 12755 1727204125.20530: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204125.20541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204125.20553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204125.20569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204125.20582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204125.20591: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204125.20602: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204125.20626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204125.20635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204125.20643: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12755 1727204125.20653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204125.20664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204125.20678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204125.20735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204125.20775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204125.20837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204125.20841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204125.20888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204125.23280: stdout chunk (state=3): >>>ansible-tmp-1727204125.1954198-15498-136043380167750=/root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750 <<< 12755 1727204125.23300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204125.23998: stderr chunk (state=3): >>><<< 12755 1727204125.24002: stdout chunk (state=3): >>><<< 12755 1727204125.24006: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204125.1954198-15498-136043380167750=/root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204125.24011: variable 'ansible_module_compression' from source: unknown 12755 1727204125.24014: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12755 1727204125.24016: variable 'ansible_facts' from source: unknown 12755 1727204125.24158: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/AnsiballZ_ping.py 12755 1727204125.24629: Sending initial data 12755 1727204125.24663: Sent initial data (153 bytes) 12755 1727204125.25817: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204125.25974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204125.26080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204125.26200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204125.26284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204125.28174: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12755 1727204125.28274: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204125.28352: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204125.28356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/AnsiballZ_ping.py" <<< 12755 1727204125.28365: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpq3kswa3m /root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/AnsiballZ_ping.py <<< 12755 1727204125.28488: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpq3kswa3m" to remote "/root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/AnsiballZ_ping.py" <<< 12755 1727204125.30495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204125.30499: stdout chunk (state=3): >>><<< 12755 1727204125.30501: stderr chunk (state=3): >>><<< 12755 1727204125.30504: done transferring module to remote 12755 1727204125.30506: _low_level_execute_command(): starting 12755 1727204125.30508: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/ /root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/AnsiballZ_ping.py && sleep 0' 12755 1727204125.31273: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204125.31301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204125.31323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204125.31342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204125.31358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 
1727204125.31414: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204125.31471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204125.31487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204125.31520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204125.31639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204125.33825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204125.33848: stderr chunk (state=3): >>><<< 12755 1727204125.33921: stdout chunk (state=3): >>><<< 12755 1727204125.33974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204125.33978: _low_level_execute_command(): starting 12755 1727204125.33982: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/AnsiballZ_ping.py && sleep 0' 12755 1727204125.34723: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204125.34727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204125.34730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204125.34732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 12755 1727204125.34808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204125.52645: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12755 1727204125.54241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204125.54245: stdout chunk (state=3): >>><<< 12755 1727204125.54247: stderr chunk (state=3): >>><<< 12755 1727204125.54270: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204125.54311: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204125.54333: _low_level_execute_command(): starting 12755 1727204125.54395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204125.1954198-15498-136043380167750/ > /dev/null 2>&1 && sleep 0' 12755 1727204125.55028: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204125.55051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204125.55066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204125.55088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204125.55112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204125.55126: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204125.55141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204125.55279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204125.55317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204125.57374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204125.57378: stdout chunk (state=3): >>><<< 12755 1727204125.57381: stderr chunk (state=3): >>><<< 12755 1727204125.57414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
12755 1727204125.57433: handler run complete 12755 1727204125.57459: attempt loop complete, returning result 12755 1727204125.57467: _execute() done 12755 1727204125.57474: dumping result to json 12755 1727204125.57483: done dumping result, returning 12755 1727204125.57597: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-72e9-1a19-0000000000e4] 12755 1727204125.57602: sending task result for task 12b410aa-8751-72e9-1a19-0000000000e4 12755 1727204125.57673: done sending task result for task 12b410aa-8751-72e9-1a19-0000000000e4 12755 1727204125.57677: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 12755 1727204125.57759: no more pending results, returning what we have 12755 1727204125.57763: results queue empty 12755 1727204125.57764: checking for any_errors_fatal 12755 1727204125.57771: done checking for any_errors_fatal 12755 1727204125.57772: checking for max_fail_percentage 12755 1727204125.57773: done checking for max_fail_percentage 12755 1727204125.57775: checking to see if all hosts have failed and the running result is not ok 12755 1727204125.57776: done checking to see if all hosts have failed 12755 1727204125.57777: getting the remaining hosts for this loop 12755 1727204125.57779: done getting the remaining hosts for this loop 12755 1727204125.57784: getting the next task for host managed-node1 12755 1727204125.57904: done getting next task for host managed-node1 12755 1727204125.57908: ^ task is: TASK: meta (role_complete) 12755 1727204125.57914: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204125.57931: getting variables 12755 1727204125.57933: in VariableManager get_vars() 12755 1727204125.58105: Calling all_inventory to load vars for managed-node1 12755 1727204125.58111: Calling groups_inventory to load vars for managed-node1 12755 1727204125.58114: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204125.58126: Calling all_plugins_play to load vars for managed-node1 12755 1727204125.58131: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204125.58136: Calling groups_plugins_play to load vars for managed-node1 12755 1727204125.60595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204125.63564: done with get_vars() 12755 1727204125.63606: done getting variables 12755 1727204125.63712: done queuing things up, now waiting for results queue to drain 12755 1727204125.63715: results queue empty 12755 1727204125.63716: checking for any_errors_fatal 12755 1727204125.63720: done checking for any_errors_fatal 12755 1727204125.63721: checking for max_fail_percentage 12755 1727204125.63723: done checking for max_fail_percentage 12755 1727204125.63724: checking to see if all hosts have failed and the running result is not ok 12755 1727204125.63725: done checking to see if all hosts have failed 12755 1727204125.63726: getting the remaining hosts for this loop 12755 1727204125.63727: done getting the remaining hosts for this loop 12755 1727204125.63731: getting the next task for host managed-node1 12755 1727204125.63738: done getting next task for host managed-node1 12755 1727204125.63742: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12755 1727204125.63745: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204125.63759: getting variables 12755 1727204125.63761: in VariableManager get_vars() 12755 1727204125.63793: Calling all_inventory to load vars for managed-node1 12755 1727204125.63797: Calling groups_inventory to load vars for managed-node1 12755 1727204125.63799: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204125.63806: Calling all_plugins_play to load vars for managed-node1 12755 1727204125.63809: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204125.63813: Calling groups_plugins_play to load vars for managed-node1 12755 1727204125.65902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204125.68375: done with get_vars() 12755 1727204125.68405: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.576) 0:00:50.920 ***** 12755 1727204125.68477: entering _queue_task() for managed-node1/include_tasks 12755 1727204125.68747: worker is 1 (out of 1 available) 12755 1727204125.68764: exiting _queue_task() for managed-node1/include_tasks 12755 1727204125.68778: done queuing things up, now waiting for results queue to drain 12755 1727204125.68780: waiting for pending results... 
12755 1727204125.68996: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12755 1727204125.69296: in run() - task 12b410aa-8751-72e9-1a19-00000000011b 12755 1727204125.69301: variable 'ansible_search_path' from source: unknown 12755 1727204125.69304: variable 'ansible_search_path' from source: unknown 12755 1727204125.69307: calling self._execute() 12755 1727204125.69309: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204125.69312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204125.69314: variable 'omit' from source: magic vars 12755 1727204125.69662: variable 'ansible_distribution_major_version' from source: facts 12755 1727204125.69673: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204125.69680: _execute() done 12755 1727204125.69683: dumping result to json 12755 1727204125.69688: done dumping result, returning 12755 1727204125.69700: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-72e9-1a19-00000000011b] 12755 1727204125.69706: sending task result for task 12b410aa-8751-72e9-1a19-00000000011b 12755 1727204125.69807: done sending task result for task 12b410aa-8751-72e9-1a19-00000000011b 12755 1727204125.69811: WORKER PROCESS EXITING 12755 1727204125.69863: no more pending results, returning what we have 12755 1727204125.69869: in VariableManager get_vars() 12755 1727204125.69942: Calling all_inventory to load vars for managed-node1 12755 1727204125.69946: Calling groups_inventory to load vars for managed-node1 12755 1727204125.69949: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204125.69960: Calling all_plugins_play to load vars for managed-node1 12755 1727204125.69963: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204125.69966: Calling 
groups_plugins_play to load vars for managed-node1 12755 1727204125.71458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204125.73019: done with get_vars() 12755 1727204125.73041: variable 'ansible_search_path' from source: unknown 12755 1727204125.73043: variable 'ansible_search_path' from source: unknown 12755 1727204125.73075: we have included files to process 12755 1727204125.73076: generating all_blocks data 12755 1727204125.73078: done generating all_blocks data 12755 1727204125.73084: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204125.73085: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204125.73087: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204125.73581: done processing included file 12755 1727204125.73583: iterating over new_blocks loaded from include file 12755 1727204125.73584: in VariableManager get_vars() 12755 1727204125.73615: done with get_vars() 12755 1727204125.73616: filtering new block on tags 12755 1727204125.73631: done filtering new block on tags 12755 1727204125.73633: in VariableManager get_vars() 12755 1727204125.73656: done with get_vars() 12755 1727204125.73657: filtering new block on tags 12755 1727204125.73675: done filtering new block on tags 12755 1727204125.73677: in VariableManager get_vars() 12755 1727204125.73703: done with get_vars() 12755 1727204125.73705: filtering new block on tags 12755 1727204125.73720: done filtering new block on tags 12755 1727204125.73722: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 12755 1727204125.73726: extending task lists for 
all hosts with included blocks 12755 1727204125.74360: done extending task lists 12755 1727204125.74361: done processing included files 12755 1727204125.74362: results queue empty 12755 1727204125.74363: checking for any_errors_fatal 12755 1727204125.74364: done checking for any_errors_fatal 12755 1727204125.74365: checking for max_fail_percentage 12755 1727204125.74366: done checking for max_fail_percentage 12755 1727204125.74366: checking to see if all hosts have failed and the running result is not ok 12755 1727204125.74367: done checking to see if all hosts have failed 12755 1727204125.74367: getting the remaining hosts for this loop 12755 1727204125.74368: done getting the remaining hosts for this loop 12755 1727204125.74370: getting the next task for host managed-node1 12755 1727204125.74373: done getting next task for host managed-node1 12755 1727204125.74375: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12755 1727204125.74378: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204125.74386: getting variables 12755 1727204125.74387: in VariableManager get_vars() 12755 1727204125.74405: Calling all_inventory to load vars for managed-node1 12755 1727204125.74407: Calling groups_inventory to load vars for managed-node1 12755 1727204125.74409: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204125.74415: Calling all_plugins_play to load vars for managed-node1 12755 1727204125.74417: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204125.74419: Calling groups_plugins_play to load vars for managed-node1 12755 1727204125.75612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204125.77594: done with get_vars() 12755 1727204125.77626: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.092) 0:00:51.012 ***** 12755 1727204125.77709: entering _queue_task() for managed-node1/setup 12755 1727204125.78061: worker is 1 (out of 1 available) 12755 1727204125.78075: exiting _queue_task() for managed-node1/setup 12755 1727204125.78088: done queuing things up, now waiting for results queue to drain 12755 1727204125.78092: waiting for pending results... 
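The task queued here ("Ensure ansible_facts used by role are present") is guarded by the Jinja2 conditional evaluated a few events later: `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`. A minimal plain-Python sketch of that test follows; the fact names are illustrative placeholders, not the role's actual required list.

```python
# Plain-Python sketch of the set_facts.yml:3 conditional:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# Fact names below are illustrative, not the role's real list.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

# Jinja2's difference() filter is set subtraction over the two sequences.
missing = set(required_facts) - set(ansible_facts.keys())
run_setup = len(missing) > 0  # False: every required fact is present
print(run_setup)  # False -> the setup task is skipped, as in the trace
```

When all required facts were gathered on an earlier pass, the conditional is False and the `setup` task is skipped, which is exactly what the result below reports.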
12755 1727204125.78514: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12755 1727204125.78578: in run() - task 12b410aa-8751-72e9-1a19-00000000084f 12755 1727204125.78592: variable 'ansible_search_path' from source: unknown 12755 1727204125.78596: variable 'ansible_search_path' from source: unknown 12755 1727204125.78633: calling self._execute() 12755 1727204125.78727: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204125.78733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204125.78747: variable 'omit' from source: magic vars 12755 1727204125.79091: variable 'ansible_distribution_major_version' from source: facts 12755 1727204125.79103: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204125.79325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204125.81045: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204125.81101: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204125.81317: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204125.81347: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204125.81514: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204125.81518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204125.81521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204125.81541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204125.81594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204125.81616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204125.81682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204125.81713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204125.81749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204125.81799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204125.81821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204125.82009: variable '__network_required_facts' from source: role 
'' defaults 12755 1727204125.82023: variable 'ansible_facts' from source: unknown 12755 1727204125.82885: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12755 1727204125.82892: when evaluation is False, skipping this task 12755 1727204125.82895: _execute() done 12755 1727204125.82898: dumping result to json 12755 1727204125.82901: done dumping result, returning 12755 1727204125.82911: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-72e9-1a19-00000000084f] 12755 1727204125.82918: sending task result for task 12b410aa-8751-72e9-1a19-00000000084f 12755 1727204125.83013: done sending task result for task 12b410aa-8751-72e9-1a19-00000000084f 12755 1727204125.83016: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204125.83079: no more pending results, returning what we have 12755 1727204125.83083: results queue empty 12755 1727204125.83084: checking for any_errors_fatal 12755 1727204125.83086: done checking for any_errors_fatal 12755 1727204125.83087: checking for max_fail_percentage 12755 1727204125.83090: done checking for max_fail_percentage 12755 1727204125.83091: checking to see if all hosts have failed and the running result is not ok 12755 1727204125.83093: done checking to see if all hosts have failed 12755 1727204125.83093: getting the remaining hosts for this loop 12755 1727204125.83095: done getting the remaining hosts for this loop 12755 1727204125.83100: getting the next task for host managed-node1 12755 1727204125.83111: done getting next task for host managed-node1 12755 1727204125.83115: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204125.83120: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204125.83144: getting variables 12755 1727204125.83146: in VariableManager get_vars() 12755 1727204125.83207: Calling all_inventory to load vars for managed-node1 12755 1727204125.83210: Calling groups_inventory to load vars for managed-node1 12755 1727204125.83213: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204125.83227: Calling all_plugins_play to load vars for managed-node1 12755 1727204125.83230: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204125.83234: Calling groups_plugins_play to load vars for managed-node1 12755 1727204125.84615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204125.87627: done with get_vars() 12755 1727204125.87672: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.100) 0:00:51.113 ***** 12755 1727204125.87818: entering _queue_task() for managed-node1/stat 12755 1727204125.88206: worker is 1 (out of 1 
available) 12755 1727204125.88222: exiting _queue_task() for managed-node1/stat 12755 1727204125.88237: done queuing things up, now waiting for results queue to drain 12755 1727204125.88239: waiting for pending results... 12755 1727204125.88611: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204125.88753: in run() - task 12b410aa-8751-72e9-1a19-000000000851 12755 1727204125.88770: variable 'ansible_search_path' from source: unknown 12755 1727204125.88774: variable 'ansible_search_path' from source: unknown 12755 1727204125.88897: calling self._execute() 12755 1727204125.88935: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204125.88943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204125.88962: variable 'omit' from source: magic vars 12755 1727204125.89409: variable 'ansible_distribution_major_version' from source: facts 12755 1727204125.89426: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204125.89653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204125.89984: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204125.90051: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204125.90097: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204125.90140: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204125.90295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204125.90299: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204125.90330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204125.90366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204125.90475: variable '__network_is_ostree' from source: set_fact 12755 1727204125.90492: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204125.90496: when evaluation is False, skipping this task 12755 1727204125.90499: _execute() done 12755 1727204125.90694: dumping result to json 12755 1727204125.90698: done dumping result, returning 12755 1727204125.90700: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-72e9-1a19-000000000851] 12755 1727204125.90709: sending task result for task 12b410aa-8751-72e9-1a19-000000000851 12755 1727204125.90780: done sending task result for task 12b410aa-8751-72e9-1a19-000000000851 12755 1727204125.90782: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204125.90866: no more pending results, returning what we have 12755 1727204125.90872: results queue empty 12755 1727204125.90873: checking for any_errors_fatal 12755 1727204125.90879: done checking for any_errors_fatal 12755 1727204125.90880: checking for max_fail_percentage 12755 1727204125.90882: done checking for max_fail_percentage 12755 1727204125.90883: checking to see if all hosts have failed and the running result is not ok 12755 
1727204125.90884: done checking to see if all hosts have failed 12755 1727204125.90886: getting the remaining hosts for this loop 12755 1727204125.90887: done getting the remaining hosts for this loop 12755 1727204125.90894: getting the next task for host managed-node1 12755 1727204125.90903: done getting next task for host managed-node1 12755 1727204125.90907: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204125.90914: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204125.90936: getting variables 12755 1727204125.90938: in VariableManager get_vars() 12755 1727204125.91105: Calling all_inventory to load vars for managed-node1 12755 1727204125.91112: Calling groups_inventory to load vars for managed-node1 12755 1727204125.91115: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204125.91128: Calling all_plugins_play to load vars for managed-node1 12755 1727204125.91131: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204125.91135: Calling groups_plugins_play to load vars for managed-node1 12755 1727204125.93357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204125.97814: done with get_vars() 12755 1727204125.97850: done getting variables 12755 1727204125.97919: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:25 -0400 (0:00:00.101) 0:00:51.215 ***** 12755 1727204125.97968: entering _queue_task() for managed-node1/set_fact 12755 1727204125.98759: worker is 1 (out of 1 available) 12755 1727204125.98773: exiting _queue_task() for managed-node1/set_fact 12755 1727204125.98992: done queuing things up, now waiting for results queue to drain 12755 1727204125.98995: waiting for pending results... 
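Both ostree-related tasks in this stretch skip on `Evaluated conditional (not __network_is_ostree is defined): False`. In Jinja2, `is defined` asks only whether the variable exists in the host's namespace at all, regardless of its value. A hedged Python sketch of that guard:

```python
# Sketch of the `when: not __network_is_ostree is defined` guard.
# Once the fact exists for a host (set on an earlier role pass here),
# both the ostree check and the follow-up set_fact task are skipped.
host_vars = {"__network_is_ostree": False}  # the False value is illustrative

def guard_passes(facts):
    # Mirrors Jinja2's `not __network_is_ostree is defined`
    return "__network_is_ostree" not in facts

print(guard_passes(host_vars))  # False -> "Conditional result was False"
print(guard_passes({}))         # True  -> the check would run on a fresh host
```

Note the guard is true only on the first pass; re-running the role against the same host skips the detection work, which is why the trace shows back-to-back skips.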
12755 1727204125.99396: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204125.99829: in run() - task 12b410aa-8751-72e9-1a19-000000000852 12755 1727204125.99845: variable 'ansible_search_path' from source: unknown 12755 1727204125.99853: variable 'ansible_search_path' from source: unknown 12755 1727204125.99936: calling self._execute() 12755 1727204126.00182: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204126.00187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204126.00192: variable 'omit' from source: magic vars 12755 1727204126.00995: variable 'ansible_distribution_major_version' from source: facts 12755 1727204126.00998: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204126.01051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204126.01365: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204126.01431: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204126.01476: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204126.01521: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204126.01616: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204126.01658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204126.01691: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204126.01727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204126.01994: variable '__network_is_ostree' from source: set_fact 12755 1727204126.01997: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204126.01999: when evaluation is False, skipping this task 12755 1727204126.02001: _execute() done 12755 1727204126.02003: dumping result to json 12755 1727204126.02005: done dumping result, returning 12755 1727204126.02007: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-72e9-1a19-000000000852] 12755 1727204126.02011: sending task result for task 12b410aa-8751-72e9-1a19-000000000852 12755 1727204126.02079: done sending task result for task 12b410aa-8751-72e9-1a19-000000000852 12755 1727204126.02083: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204126.02234: no more pending results, returning what we have 12755 1727204126.02238: results queue empty 12755 1727204126.02239: checking for any_errors_fatal 12755 1727204126.02244: done checking for any_errors_fatal 12755 1727204126.02245: checking for max_fail_percentage 12755 1727204126.02248: done checking for max_fail_percentage 12755 1727204126.02249: checking to see if all hosts have failed and the running result is not ok 12755 1727204126.02250: done checking to see if all hosts have failed 12755 1727204126.02251: getting the remaining hosts for this loop 12755 1727204126.02253: done getting the remaining hosts for this loop 
12755 1727204126.02257: getting the next task for host managed-node1 12755 1727204126.02266: done getting next task for host managed-node1 12755 1727204126.02271: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204126.02276: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204126.02297: getting variables 12755 1727204126.02299: in VariableManager get_vars() 12755 1727204126.02355: Calling all_inventory to load vars for managed-node1 12755 1727204126.02359: Calling groups_inventory to load vars for managed-node1 12755 1727204126.02362: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204126.02373: Calling all_plugins_play to load vars for managed-node1 12755 1727204126.02377: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204126.02381: Calling groups_plugins_play to load vars for managed-node1 12755 1727204126.04517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204126.07514: done with get_vars() 12755 1727204126.07559: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:26 -0400 (0:00:00.097) 0:00:51.312 ***** 12755 1727204126.07675: entering _queue_task() for managed-node1/service_facts 12755 1727204126.08042: worker is 1 (out of 1 available) 12755 1727204126.08057: exiting _queue_task() for managed-node1/service_facts 12755 1727204126.08072: done queuing things up, now waiting for results queue to drain 12755 1727204126.08074: waiting for pending results... 
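The repeated stderr lines further on (`auto-mux: Trying existing master`, `mux_client_request_session: master session id: 2`) show that every `_low_level_execute_command()` call rides one persistent OpenSSH master connection instead of renegotiating SSH per command. This behavior comes from Ansible's default `ssh_args`; an explicit `ansible.cfg` fragment with roughly equivalent settings (values illustrative, and subject to change between Ansible versions) would be:

```ini
[ssh_connection]
# Multiplex commands over one long-lived master connection per host.
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s
```

With multiplexing active, only the first command pays the full SSH handshake cost; the `echo ~`, `mkdir`, and module-transfer steps below all reuse master session id 2.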
12755 1727204126.08433: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204126.08621: in run() - task 12b410aa-8751-72e9-1a19-000000000854 12755 1727204126.08678: variable 'ansible_search_path' from source: unknown 12755 1727204126.08682: variable 'ansible_search_path' from source: unknown 12755 1727204126.08693: calling self._execute() 12755 1727204126.08914: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204126.08917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204126.08920: variable 'omit' from source: magic vars 12755 1727204126.09307: variable 'ansible_distribution_major_version' from source: facts 12755 1727204126.09322: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204126.09329: variable 'omit' from source: magic vars 12755 1727204126.09431: variable 'omit' from source: magic vars 12755 1727204126.09478: variable 'omit' from source: magic vars 12755 1727204126.09529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204126.09576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204126.09602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204126.09625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204126.09638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204126.09679: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204126.09683: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204126.09688: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12755 1727204126.09816: Set connection var ansible_connection to ssh 12755 1727204126.09825: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204126.09828: Set connection var ansible_shell_type to sh 12755 1727204126.09845: Set connection var ansible_timeout to 10 12755 1727204126.09852: Set connection var ansible_shell_executable to /bin/sh 12755 1727204126.09860: Set connection var ansible_pipelining to False 12755 1727204126.09891: variable 'ansible_shell_executable' from source: unknown 12755 1727204126.09899: variable 'ansible_connection' from source: unknown 12755 1727204126.09902: variable 'ansible_module_compression' from source: unknown 12755 1727204126.09905: variable 'ansible_shell_type' from source: unknown 12755 1727204126.09907: variable 'ansible_shell_executable' from source: unknown 12755 1727204126.09914: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204126.09919: variable 'ansible_pipelining' from source: unknown 12755 1727204126.09924: variable 'ansible_timeout' from source: unknown 12755 1727204126.09930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204126.10224: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204126.10228: variable 'omit' from source: magic vars 12755 1727204126.10231: starting attempt loop 12755 1727204126.10233: running the handler 12755 1727204126.10236: _low_level_execute_command(): starting 12755 1727204126.10238: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204126.11063: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204126.11077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204126.11104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204126.11183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204126.13160: stdout chunk (state=3): >>>/root <<< 12755 1727204126.13495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204126.13498: stdout chunk (state=3): >>><<< 12755 1727204126.13501: stderr chunk (state=3): >>><<< 12755 1727204126.13505: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204126.13512: _low_level_execute_command(): starting 12755 1727204126.13516: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185 `" && echo ansible-tmp-1727204126.1341414-15607-129462599026185="` echo /root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185 `" ) && sleep 0' 12755 1727204126.14033: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204126.14043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204126.14055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204126.14071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204126.14085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204126.14097: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204126.14195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204126.14213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204126.14247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204126.14299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204126.16440: stdout chunk (state=3): >>>ansible-tmp-1727204126.1341414-15607-129462599026185=/root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185 <<< 12755 1727204126.16629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204126.16633: stderr chunk (state=3): >>><<< 12755 1727204126.16635: stdout chunk (state=3): >>><<< 12755 1727204126.16795: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204126.1341414-15607-129462599026185=/root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204126.16799: variable 'ansible_module_compression' from source: unknown 12755 1727204126.16802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12755 1727204126.16820: variable 'ansible_facts' from source: unknown 12755 1727204126.16932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/AnsiballZ_service_facts.py 12755 1727204126.17233: Sending initial data 12755 1727204126.17236: Sent initial data (162 bytes) 12755 1727204126.17866: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204126.17995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204126.18001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204126.18226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204126.18229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204126.18399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204126.20127: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204126.20156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204126.20207: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpfx05snr8 /root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/AnsiballZ_service_facts.py <<< 12755 1727204126.20232: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/AnsiballZ_service_facts.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpfx05snr8" to remote "/root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/AnsiballZ_service_facts.py" <<< 12755 1727204126.22512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204126.22577: stderr chunk (state=3): >>><<< 12755 1727204126.22699: stdout chunk (state=3): >>><<< 12755 1727204126.22702: done transferring module to remote 12755 1727204126.22718: _low_level_execute_command(): starting 12755 1727204126.22728: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/ /root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/AnsiballZ_service_facts.py && sleep 0' 12755 1727204126.23815: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204126.23955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204126.24009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204126.24178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204126.24258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204126.26388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204126.26482: stderr chunk (state=3): >>><<< 12755 1727204126.26618: stdout chunk (state=3): >>><<< 12755 1727204126.26643: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204126.26658: _low_level_execute_command(): starting 12755 1727204126.26785: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/AnsiballZ_service_facts.py && sleep 0' 12755 1727204126.27768: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204126.27772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204126.27774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204126.27777: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204126.28138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204126.28214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204126.28417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204128.46053: stdout chunk (state=3): 
>>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": 
{"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": 
"lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name"<<< 12755 1727204128.46253: stdout chunk (state=3): >>>: "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": 
"systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": 
{"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": 
"polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12755 1727204128.48195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204128.48199: stdout chunk (state=3): >>><<< 12755 1727204128.48203: stderr chunk (state=3): >>><<< 12755 1727204128.48207: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": 
{"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", 
"status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": 
"grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204128.49651: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204128.49656: _low_level_execute_command(): starting 12755 1727204128.49659: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204126.1341414-15607-129462599026185/ > /dev/null 2>&1 && sleep 0' 12755 1727204128.50305: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204128.50311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204128.50315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204128.50324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 12755 1727204128.50336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204128.50345: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204128.50358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204128.50387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204128.50393: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204128.50396: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12755 1727204128.50406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204128.50419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204128.50447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204128.50450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204128.50452: stderr chunk (state=3): >>>debug2: match found <<< 12755 1727204128.50457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204128.50533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204128.50550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204128.50574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204128.50724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204128.53017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204128.53021: stdout chunk (state=3): >>><<< 12755 1727204128.53029: stderr chunk (state=3): >>><<< 12755 1727204128.53052: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204128.53060: handler run complete 12755 1727204128.53359: variable 'ansible_facts' from source: unknown 12755 1727204128.53643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204128.54668: variable 'ansible_facts' from source: unknown 12755 1727204128.54916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204128.55304: attempt loop complete, returning result 12755 1727204128.55316: _execute() done 12755 1727204128.55324: dumping result to json 12755 1727204128.55409: done dumping result, returning 12755 1727204128.55444: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 
[12b410aa-8751-72e9-1a19-000000000854] 12755 1727204128.55494: sending task result for task 12b410aa-8751-72e9-1a19-000000000854 12755 1727204128.57396: done sending task result for task 12b410aa-8751-72e9-1a19-000000000854 12755 1727204128.57400: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204128.57524: no more pending results, returning what we have 12755 1727204128.57527: results queue empty 12755 1727204128.57528: checking for any_errors_fatal 12755 1727204128.57532: done checking for any_errors_fatal 12755 1727204128.57533: checking for max_fail_percentage 12755 1727204128.57534: done checking for max_fail_percentage 12755 1727204128.57535: checking to see if all hosts have failed and the running result is not ok 12755 1727204128.57536: done checking to see if all hosts have failed 12755 1727204128.57537: getting the remaining hosts for this loop 12755 1727204128.57539: done getting the remaining hosts for this loop 12755 1727204128.57543: getting the next task for host managed-node1 12755 1727204128.57550: done getting next task for host managed-node1 12755 1727204128.57553: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12755 1727204128.57558: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204128.57572: getting variables 12755 1727204128.57574: in VariableManager get_vars() 12755 1727204128.57731: Calling all_inventory to load vars for managed-node1 12755 1727204128.57735: Calling groups_inventory to load vars for managed-node1 12755 1727204128.57738: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204128.57749: Calling all_plugins_play to load vars for managed-node1 12755 1727204128.57752: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204128.57760: Calling groups_plugins_play to load vars for managed-node1 12755 1727204128.62393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204128.67107: done with get_vars() 12755 1727204128.67164: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:28 -0400 (0:00:02.596) 0:00:53.909 ***** 12755 1727204128.67337: entering _queue_task() for managed-node1/package_facts 12755 1727204128.67786: worker is 1 (out of 1 available) 12755 1727204128.67832: exiting _queue_task() for managed-node1/package_facts 12755 1727204128.67847: done queuing things up, now waiting for results queue to drain 12755 1727204128.67849: waiting for pending results... 
12755 1727204128.68208: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12755 1727204128.68297: in run() - task 12b410aa-8751-72e9-1a19-000000000855 12755 1727204128.68325: variable 'ansible_search_path' from source: unknown 12755 1727204128.68335: variable 'ansible_search_path' from source: unknown 12755 1727204128.68384: calling self._execute() 12755 1727204128.68505: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204128.68519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204128.68536: variable 'omit' from source: magic vars 12755 1727204128.68993: variable 'ansible_distribution_major_version' from source: facts 12755 1727204128.69016: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204128.69030: variable 'omit' from source: magic vars 12755 1727204128.69127: variable 'omit' from source: magic vars 12755 1727204128.69181: variable 'omit' from source: magic vars 12755 1727204128.69237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204128.69284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204128.69314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204128.69393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204128.69397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204128.69399: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204128.69403: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204128.69411: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12755 1727204128.69535: Set connection var ansible_connection to ssh 12755 1727204128.69550: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204128.69558: Set connection var ansible_shell_type to sh 12755 1727204128.69577: Set connection var ansible_timeout to 10 12755 1727204128.69590: Set connection var ansible_shell_executable to /bin/sh 12755 1727204128.69603: Set connection var ansible_pipelining to False 12755 1727204128.69631: variable 'ansible_shell_executable' from source: unknown 12755 1727204128.69697: variable 'ansible_connection' from source: unknown 12755 1727204128.69700: variable 'ansible_module_compression' from source: unknown 12755 1727204128.69703: variable 'ansible_shell_type' from source: unknown 12755 1727204128.69705: variable 'ansible_shell_executable' from source: unknown 12755 1727204128.69707: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204128.69709: variable 'ansible_pipelining' from source: unknown 12755 1727204128.69711: variable 'ansible_timeout' from source: unknown 12755 1727204128.69713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204128.69920: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204128.69938: variable 'omit' from source: magic vars 12755 1727204128.69951: starting attempt loop 12755 1727204128.69958: running the handler 12755 1727204128.69978: _low_level_execute_command(): starting 12755 1727204128.69993: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204128.70795: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204128.70816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204128.70877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204128.70907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204128.70934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204128.71059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204128.72976: stdout chunk (state=3): >>>/root <<< 12755 1727204128.73395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204128.73399: stdout chunk (state=3): >>><<< 12755 1727204128.73402: stderr chunk (state=3): >>><<< 12755 1727204128.73405: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204128.73407: _low_level_execute_command(): starting 12755 1727204128.73410: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041 `" && echo ansible-tmp-1727204128.732297-15740-179517321363041="` echo /root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041 `" ) && sleep 0' 12755 1727204128.74126: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204128.74136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204128.74230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204128.74258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204128.74270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204128.74292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204128.74364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204128.76573: stdout chunk (state=3): >>>ansible-tmp-1727204128.732297-15740-179517321363041=/root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041 <<< 12755 1727204128.76896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204128.76900: stdout chunk (state=3): >>><<< 12755 1727204128.76903: stderr chunk (state=3): >>><<< 12755 1727204128.76905: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204128.732297-15740-179517321363041=/root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204128.76908: variable 'ansible_module_compression' from source: unknown 12755 1727204128.76945: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12755 1727204128.77037: variable 'ansible_facts' from source: unknown 12755 1727204128.77230: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/AnsiballZ_package_facts.py 12755 1727204128.77577: Sending initial data 12755 1727204128.77582: Sent initial data (161 bytes) 12755 1727204128.78040: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204128.78051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204128.78137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204128.78143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 
1727204128.78147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204128.78225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204128.78229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204128.78325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204128.80190: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204128.80231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204128.80302: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpj9q6dd6f /root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/AnsiballZ_package_facts.py <<< 12755 1727204128.80305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/AnsiballZ_package_facts.py" <<< 12755 1727204128.80336: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpj9q6dd6f" to remote "/root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/AnsiballZ_package_facts.py" <<< 12755 1727204128.80345: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/AnsiballZ_package_facts.py" <<< 12755 1727204128.82402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204128.82409: stderr chunk (state=3): >>><<< 12755 1727204128.82418: stdout chunk (state=3): >>><<< 12755 1727204128.82442: done transferring module to remote 12755 1727204128.82453: _low_level_execute_command(): starting 12755 1727204128.82458: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/ /root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/AnsiballZ_package_facts.py && sleep 0' 12755 1727204128.82891: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204128.82925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204128.82928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204128.82931: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204128.82934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204128.82992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204128.82998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204128.83042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204128.85122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204128.85194: stderr chunk (state=3): >>><<< 12755 1727204128.85197: stdout chunk (state=3): >>><<< 12755 1727204128.85207: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204128.85212: _low_level_execute_command(): starting 12755 1727204128.85217: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/AnsiballZ_package_facts.py && sleep 0' 12755 1727204128.85655: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204128.85704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204128.85707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204128.85710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204128.85712: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204128.85715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 
1727204128.85757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204128.85763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204128.85825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204129.54783: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": 
"kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": 
[{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 12755 1727204129.54997: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", 
"release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 12755 1727204129.55008: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": 
"9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": 
[{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 12755 1727204129.55026: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": 
"5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": 
"1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": 
[{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": 
"20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": 
"librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": 
"sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 12755 1727204129.55108: stdout chunk (state=3): >>>"perl-URI", 
"version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": 
"502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": 
[{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null<<< 12755 1727204129.55118: stdout chunk (state=3): >>>, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release":
"16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, 
"invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12755 1727204129.57215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204129.57316: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. <<< 12755 1727204129.57342: stderr chunk (state=3): >>><<< 12755 1727204129.57368: stdout chunk (state=3): >>><<< 12755 1727204129.57459: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": 
"36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": 
"3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", 
"version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": 
"diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": 
"1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": 
"nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": 
"7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": 
"1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": 
"121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": 
"3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": 
"3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", 
"version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": 
"500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": 
"perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": 
"python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": 
[{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204129.61213: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204129.61230: _low_level_execute_command(): starting 12755 1727204129.61236: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204128.732297-15740-179517321363041/ > /dev/null 2>&1 && sleep 0' 12755 1727204129.61951: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204129.61971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204129.64149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204129.64153: stdout chunk (state=3): >>><<< 12755 1727204129.64155: stderr chunk (state=3): >>><<< 12755 1727204129.64174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204129.64188: handler run complete 12755 1727204129.65896: variable 'ansible_facts' from source: unknown 12755 1727204129.73625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204129.77584: variable 'ansible_facts' from source: unknown 12755 1727204129.78343: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204129.80374: attempt loop complete, returning result 12755 1727204129.80433: _execute() done 12755 1727204129.80466: dumping result to json 12755 1727204129.81125: done dumping result, returning 12755 1727204129.81192: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-72e9-1a19-000000000855] 12755 1727204129.81209: sending task result for task 12b410aa-8751-72e9-1a19-000000000855 12755 1727204129.93544: done sending task result for task 12b410aa-8751-72e9-1a19-000000000855 12755 1727204129.93548: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204129.93678: no more pending results, returning what we have 12755 1727204129.93681: results queue empty 12755 1727204129.93682: checking for any_errors_fatal 12755 1727204129.93689: done checking for any_errors_fatal 12755 1727204129.93690: checking for max_fail_percentage 12755 1727204129.93691: done checking for max_fail_percentage 12755 1727204129.93692: checking to see if all hosts have failed and the running result is not ok 12755 1727204129.93693: done checking to see if all hosts have failed 12755 1727204129.93694: getting the remaining hosts for this loop 12755 1727204129.93696: done getting the remaining hosts for this loop 12755 1727204129.93702: getting the next task for host managed-node1 12755 1727204129.93708: done getting next task for host managed-node1 12755 1727204129.93712: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204129.93715: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204129.93727: getting variables 12755 1727204129.93728: in VariableManager get_vars() 12755 1727204129.93764: Calling all_inventory to load vars for managed-node1 12755 1727204129.93768: Calling groups_inventory to load vars for managed-node1 12755 1727204129.93771: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204129.93779: Calling all_plugins_play to load vars for managed-node1 12755 1727204129.93783: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204129.93787: Calling groups_plugins_play to load vars for managed-node1 12755 1727204129.94994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204129.96911: done with get_vars() 12755 1727204129.96950: done getting variables 12755 1727204129.97012: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:29 -0400 (0:00:01.297) 0:00:55.206 ***** 12755 1727204129.97047: entering _queue_task() for managed-node1/debug 12755 1727204129.97424: worker is 1 (out of 1 available) 12755 1727204129.97438: exiting _queue_task() for managed-node1/debug 12755 
1727204129.97453: done queuing things up, now waiting for results queue to drain 12755 1727204129.97454: waiting for pending results... 12755 1727204129.97821: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204129.97982: in run() - task 12b410aa-8751-72e9-1a19-00000000011c 12755 1727204129.98012: variable 'ansible_search_path' from source: unknown 12755 1727204129.98095: variable 'ansible_search_path' from source: unknown 12755 1727204129.98102: calling self._execute() 12755 1727204129.98203: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204129.98226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204129.98249: variable 'omit' from source: magic vars 12755 1727204129.99209: variable 'ansible_distribution_major_version' from source: facts 12755 1727204129.99228: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204129.99236: variable 'omit' from source: magic vars 12755 1727204129.99297: variable 'omit' from source: magic vars 12755 1727204129.99378: variable 'network_provider' from source: set_fact 12755 1727204129.99395: variable 'omit' from source: magic vars 12755 1727204129.99437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204129.99488: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204129.99514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204129.99533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204129.99548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204129.99582: variable 'inventory_hostname' from source: 
host vars for 'managed-node1' 12755 1727204129.99586: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204129.99591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204129.99686: Set connection var ansible_connection to ssh 12755 1727204129.99694: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204129.99697: Set connection var ansible_shell_type to sh 12755 1727204129.99708: Set connection var ansible_timeout to 10 12755 1727204129.99717: Set connection var ansible_shell_executable to /bin/sh 12755 1727204129.99745: Set connection var ansible_pipelining to False 12755 1727204129.99770: variable 'ansible_shell_executable' from source: unknown 12755 1727204129.99775: variable 'ansible_connection' from source: unknown 12755 1727204129.99778: variable 'ansible_module_compression' from source: unknown 12755 1727204129.99780: variable 'ansible_shell_type' from source: unknown 12755 1727204129.99783: variable 'ansible_shell_executable' from source: unknown 12755 1727204129.99786: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204129.99790: variable 'ansible_pipelining' from source: unknown 12755 1727204129.99793: variable 'ansible_timeout' from source: unknown 12755 1727204129.99796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204129.99948: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204129.99960: variable 'omit' from source: magic vars 12755 1727204129.99967: starting attempt loop 12755 1727204129.99970: running the handler 12755 1727204130.00016: handler run complete 12755 1727204130.00030: attempt loop complete, returning result 12755 
1727204130.00033: _execute() done 12755 1727204130.00036: dumping result to json 12755 1727204130.00040: done dumping result, returning 12755 1727204130.00049: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-72e9-1a19-00000000011c] 12755 1727204130.00053: sending task result for task 12b410aa-8751-72e9-1a19-00000000011c 12755 1727204130.00149: done sending task result for task 12b410aa-8751-72e9-1a19-00000000011c 12755 1727204130.00152: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 12755 1727204130.00252: no more pending results, returning what we have 12755 1727204130.00256: results queue empty 12755 1727204130.00257: checking for any_errors_fatal 12755 1727204130.00270: done checking for any_errors_fatal 12755 1727204130.00271: checking for max_fail_percentage 12755 1727204130.00272: done checking for max_fail_percentage 12755 1727204130.00274: checking to see if all hosts have failed and the running result is not ok 12755 1727204130.00275: done checking to see if all hosts have failed 12755 1727204130.00276: getting the remaining hosts for this loop 12755 1727204130.00277: done getting the remaining hosts for this loop 12755 1727204130.00282: getting the next task for host managed-node1 12755 1727204130.00303: done getting next task for host managed-node1 12755 1727204130.00309: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12755 1727204130.00312: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204130.00324: getting variables 12755 1727204130.00326: in VariableManager get_vars() 12755 1727204130.00378: Calling all_inventory to load vars for managed-node1 12755 1727204130.00382: Calling groups_inventory to load vars for managed-node1 12755 1727204130.00384: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204130.00396: Calling all_plugins_play to load vars for managed-node1 12755 1727204130.00399: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204130.00403: Calling groups_plugins_play to load vars for managed-node1 12755 1727204130.02267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204130.04587: done with get_vars() 12755 1727204130.04641: done getting variables 12755 1727204130.04698: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.076) 0:00:55.283 ***** 12755 1727204130.04741: entering _queue_task() for managed-node1/fail 12755 1727204130.05049: worker is 1 (out of 1 available) 12755 1727204130.05065: exiting _queue_task() for managed-node1/fail 12755 1727204130.05087: done queuing things up, now waiting for results queue to drain 12755 1727204130.05090: 
waiting for pending results... 12755 1727204130.05391: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12755 1727204130.05463: in run() - task 12b410aa-8751-72e9-1a19-00000000011d 12755 1727204130.05479: variable 'ansible_search_path' from source: unknown 12755 1727204130.05484: variable 'ansible_search_path' from source: unknown 12755 1727204130.05529: calling self._execute() 12755 1727204130.05631: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204130.05639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204130.05649: variable 'omit' from source: magic vars 12755 1727204130.06024: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.06035: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204130.06167: variable 'network_state' from source: role '' defaults 12755 1727204130.06193: Evaluated conditional (network_state != {}): False 12755 1727204130.06198: when evaluation is False, skipping this task 12755 1727204130.06201: _execute() done 12755 1727204130.06204: dumping result to json 12755 1727204130.06207: done dumping result, returning 12755 1727204130.06213: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-72e9-1a19-00000000011d] 12755 1727204130.06217: sending task result for task 12b410aa-8751-72e9-1a19-00000000011d skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204130.06429: no more pending results, returning what we have 12755 1727204130.06432: results queue empty 12755 1727204130.06433: checking for 
any_errors_fatal 12755 1727204130.06444: done checking for any_errors_fatal 12755 1727204130.06445: checking for max_fail_percentage 12755 1727204130.06447: done checking for max_fail_percentage 12755 1727204130.06448: checking to see if all hosts have failed and the running result is not ok 12755 1727204130.06449: done checking to see if all hosts have failed 12755 1727204130.06450: getting the remaining hosts for this loop 12755 1727204130.06451: done getting the remaining hosts for this loop 12755 1727204130.06457: getting the next task for host managed-node1 12755 1727204130.06465: done getting next task for host managed-node1 12755 1727204130.06469: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12755 1727204130.06472: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204130.06493: getting variables 12755 1727204130.06495: in VariableManager get_vars() 12755 1727204130.06547: Calling all_inventory to load vars for managed-node1 12755 1727204130.06554: Calling groups_inventory to load vars for managed-node1 12755 1727204130.06560: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204130.06572: Calling all_plugins_play to load vars for managed-node1 12755 1727204130.06576: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204130.06580: Calling groups_plugins_play to load vars for managed-node1 12755 1727204130.07095: done sending task result for task 12b410aa-8751-72e9-1a19-00000000011d 12755 1727204130.07098: WORKER PROCESS EXITING 12755 1727204130.08074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204130.09998: done with get_vars() 12755 1727204130.10029: done getting variables 12755 1727204130.10107: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.053) 0:00:55.337 ***** 12755 1727204130.10142: entering _queue_task() for managed-node1/fail 12755 1727204130.10443: worker is 1 (out of 1 available) 12755 1727204130.10458: exiting _queue_task() for managed-node1/fail 12755 1727204130.10472: done queuing things up, now waiting for results queue to drain 12755 1727204130.10474: waiting for pending results... 
12755 1727204130.10765: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12755 1727204130.10893: in run() - task 12b410aa-8751-72e9-1a19-00000000011e 12755 1727204130.10906: variable 'ansible_search_path' from source: unknown 12755 1727204130.10910: variable 'ansible_search_path' from source: unknown 12755 1727204130.10971: calling self._execute() 12755 1727204130.11064: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204130.11071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204130.11082: variable 'omit' from source: magic vars 12755 1727204130.11466: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.11490: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204130.11607: variable 'network_state' from source: role '' defaults 12755 1727204130.11621: Evaluated conditional (network_state != {}): False 12755 1727204130.11626: when evaluation is False, skipping this task 12755 1727204130.11629: _execute() done 12755 1727204130.11632: dumping result to json 12755 1727204130.11635: done dumping result, returning 12755 1727204130.11646: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-72e9-1a19-00000000011e] 12755 1727204130.11651: sending task result for task 12b410aa-8751-72e9-1a19-00000000011e 12755 1727204130.11766: done sending task result for task 12b410aa-8751-72e9-1a19-00000000011e 12755 1727204130.11769: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204130.11863: no more pending results, returning what we have 12755 
1727204130.11867: results queue empty 12755 1727204130.11868: checking for any_errors_fatal 12755 1727204130.11876: done checking for any_errors_fatal 12755 1727204130.11877: checking for max_fail_percentage 12755 1727204130.11879: done checking for max_fail_percentage 12755 1727204130.11880: checking to see if all hosts have failed and the running result is not ok 12755 1727204130.11881: done checking to see if all hosts have failed 12755 1727204130.11882: getting the remaining hosts for this loop 12755 1727204130.11884: done getting the remaining hosts for this loop 12755 1727204130.11890: getting the next task for host managed-node1 12755 1727204130.11899: done getting next task for host managed-node1 12755 1727204130.11903: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12755 1727204130.11906: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204130.11926: getting variables 12755 1727204130.11928: in VariableManager get_vars() 12755 1727204130.11980: Calling all_inventory to load vars for managed-node1 12755 1727204130.11984: Calling groups_inventory to load vars for managed-node1 12755 1727204130.11986: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204130.12032: Calling all_plugins_play to load vars for managed-node1 12755 1727204130.12036: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204130.12039: Calling groups_plugins_play to load vars for managed-node1 12755 1727204130.13315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204130.15103: done with get_vars() 12755 1727204130.15129: done getting variables 12755 1727204130.15180: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.050) 0:00:55.387 ***** 12755 1727204130.15215: entering _queue_task() for managed-node1/fail 12755 1727204130.15551: worker is 1 (out of 1 available) 12755 1727204130.15567: exiting _queue_task() for managed-node1/fail 12755 1727204130.15581: done queuing things up, now waiting for results queue to drain 12755 1727204130.15583: waiting for pending results... 
12755 1727204130.15831: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12755 1727204130.15973: in run() - task 12b410aa-8751-72e9-1a19-00000000011f 12755 1727204130.15977: variable 'ansible_search_path' from source: unknown 12755 1727204130.15980: variable 'ansible_search_path' from source: unknown 12755 1727204130.16021: calling self._execute() 12755 1727204130.16113: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204130.16123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204130.16136: variable 'omit' from source: magic vars 12755 1727204130.16483: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.16496: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204130.16659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204130.19268: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204130.19362: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204130.19407: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204130.19474: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204130.19497: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204130.19573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.19619: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.19647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.19681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.19695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.19843: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.19856: Evaluated conditional (ansible_distribution_major_version | int > 9): True 12755 1727204130.20013: variable 'ansible_distribution' from source: facts 12755 1727204130.20018: variable '__network_rh_distros' from source: role '' defaults 12755 1727204130.20037: Evaluated conditional (ansible_distribution in __network_rh_distros): False 12755 1727204130.20041: when evaluation is False, skipping this task 12755 1727204130.20044: _execute() done 12755 1727204130.20047: dumping result to json 12755 1727204130.20050: done dumping result, returning 12755 1727204130.20100: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-72e9-1a19-00000000011f] 12755 1727204130.20104: sending task result for task 12b410aa-8751-72e9-1a19-00000000011f 12755 1727204130.20199: done sending task result for task 12b410aa-8751-72e9-1a19-00000000011f 12755 1727204130.20203: WORKER PROCESS EXITING 
skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 12755 1727204130.20282: no more pending results, returning what we have 12755 1727204130.20285: results queue empty 12755 1727204130.20287: checking for any_errors_fatal 12755 1727204130.20299: done checking for any_errors_fatal 12755 1727204130.20300: checking for max_fail_percentage 12755 1727204130.20302: done checking for max_fail_percentage 12755 1727204130.20303: checking to see if all hosts have failed and the running result is not ok 12755 1727204130.20304: done checking to see if all hosts have failed 12755 1727204130.20305: getting the remaining hosts for this loop 12755 1727204130.20307: done getting the remaining hosts for this loop 12755 1727204130.20312: getting the next task for host managed-node1 12755 1727204130.20320: done getting next task for host managed-node1 12755 1727204130.20325: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12755 1727204130.20328: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204130.20354: getting variables 12755 1727204130.20356: in VariableManager get_vars() 12755 1727204130.20466: Calling all_inventory to load vars for managed-node1 12755 1727204130.20551: Calling groups_inventory to load vars for managed-node1 12755 1727204130.20555: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204130.20570: Calling all_plugins_play to load vars for managed-node1 12755 1727204130.20574: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204130.20578: Calling groups_plugins_play to load vars for managed-node1 12755 1727204130.22206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204130.23899: done with get_vars() 12755 1727204130.23935: done getting variables 12755 1727204130.23992: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.088) 0:00:55.475 ***** 12755 1727204130.24025: entering _queue_task() for managed-node1/dnf 12755 1727204130.24382: worker is 1 (out of 1 available) 12755 1727204130.24400: exiting _queue_task() for managed-node1/dnf 12755 1727204130.24415: done queuing things up, now waiting for results queue to drain 12755 1727204130.24417: waiting for pending results... 
12755 1727204130.24655: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12755 1727204130.24781: in run() - task 12b410aa-8751-72e9-1a19-000000000120 12755 1727204130.24797: variable 'ansible_search_path' from source: unknown 12755 1727204130.24800: variable 'ansible_search_path' from source: unknown 12755 1727204130.24865: calling self._execute() 12755 1727204130.24943: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204130.24950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204130.24962: variable 'omit' from source: magic vars 12755 1727204130.25298: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.25313: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204130.25484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204130.27565: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204130.27626: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204130.27676: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204130.27703: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204130.27728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204130.27835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.27904: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.27950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.28037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.28060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.28221: variable 'ansible_distribution' from source: facts 12755 1727204130.28229: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.28235: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12755 1727204130.28412: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204130.28523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.28540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.28561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.28593: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.28606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.28647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.28666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.28686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.28720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.28737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.28771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.28792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 
1727204130.28814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.28846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.28861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.28999: variable 'network_connections' from source: task vars 12755 1727204130.29016: variable 'port1_profile' from source: play vars 12755 1727204130.29188: variable 'port1_profile' from source: play vars 12755 1727204130.29193: variable 'port2_profile' from source: play vars 12755 1727204130.29196: variable 'port2_profile' from source: play vars 12755 1727204130.29258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204130.29794: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204130.29798: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204130.29803: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204130.29806: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204130.29811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204130.29814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204130.29823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.29826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204130.29828: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204130.30797: variable 'network_connections' from source: task vars 12755 1727204130.30809: variable 'port1_profile' from source: play vars 12755 1727204130.30907: variable 'port1_profile' from source: play vars 12755 1727204130.30927: variable 'port2_profile' from source: play vars 12755 1727204130.31037: variable 'port2_profile' from source: play vars 12755 1727204130.31061: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12755 1727204130.31070: when evaluation is False, skipping this task 12755 1727204130.31096: _execute() done 12755 1727204130.31099: dumping result to json 12755 1727204130.31101: done dumping result, returning 12755 1727204130.31113: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000120] 12755 1727204130.31123: sending task result for task 12b410aa-8751-72e9-1a19-000000000120 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12755 1727204130.31429: no more pending results, returning what we have 12755 
1727204130.31433: results queue empty 12755 1727204130.31434: checking for any_errors_fatal 12755 1727204130.31442: done checking for any_errors_fatal 12755 1727204130.31444: checking for max_fail_percentage 12755 1727204130.31446: done checking for max_fail_percentage 12755 1727204130.31447: checking to see if all hosts have failed and the running result is not ok 12755 1727204130.31448: done checking to see if all hosts have failed 12755 1727204130.31449: getting the remaining hosts for this loop 12755 1727204130.31451: done getting the remaining hosts for this loop 12755 1727204130.31456: getting the next task for host managed-node1 12755 1727204130.31504: done getting next task for host managed-node1 12755 1727204130.31511: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12755 1727204130.31514: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204130.31540: getting variables 12755 1727204130.31543: in VariableManager get_vars() 12755 1727204130.31744: Calling all_inventory to load vars for managed-node1 12755 1727204130.31748: Calling groups_inventory to load vars for managed-node1 12755 1727204130.31751: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204130.31809: Calling all_plugins_play to load vars for managed-node1 12755 1727204130.31815: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204130.31820: Calling groups_plugins_play to load vars for managed-node1 12755 1727204130.32454: done sending task result for task 12b410aa-8751-72e9-1a19-000000000120 12755 1727204130.32458: WORKER PROCESS EXITING 12755 1727204130.33687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204130.37110: done with get_vars() 12755 1727204130.37168: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12755 1727204130.37263: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.132) 0:00:55.608 ***** 12755 1727204130.37303: entering _queue_task() for managed-node1/yum 12755 1727204130.37665: worker is 1 (out of 1 available) 12755 1727204130.37680: exiting _queue_task() for managed-node1/yum 12755 1727204130.37698: done queuing things up, now 
waiting for results queue to drain 12755 1727204130.37699: waiting for pending results... 12755 1727204130.38026: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12755 1727204130.38352: in run() - task 12b410aa-8751-72e9-1a19-000000000121 12755 1727204130.38358: variable 'ansible_search_path' from source: unknown 12755 1727204130.38362: variable 'ansible_search_path' from source: unknown 12755 1727204130.38365: calling self._execute() 12755 1727204130.38798: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204130.38805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204130.38812: variable 'omit' from source: magic vars 12755 1727204130.39421: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.39434: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204130.39920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204130.45125: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204130.45318: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204130.45363: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204130.45572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204130.45575: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204130.45797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.45928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.46079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.46132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.46149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.46349: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.46369: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12755 1727204130.46493: when evaluation is False, skipping this task 12755 1727204130.46497: _execute() done 12755 1727204130.46500: dumping result to json 12755 1727204130.46506: done dumping result, returning 12755 1727204130.46518: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000121] 12755 1727204130.46524: sending task result for task 12b410aa-8751-72e9-1a19-000000000121 12755 1727204130.46774: done sending task result for task 12b410aa-8751-72e9-1a19-000000000121 12755 1727204130.46778: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12755 1727204130.46842: no more pending results, returning what we have 12755 1727204130.46846: results queue empty 12755 1727204130.46848: checking for any_errors_fatal 12755 1727204130.46859: done checking for any_errors_fatal 12755 1727204130.46860: checking for max_fail_percentage 12755 1727204130.46863: done checking for max_fail_percentage 12755 1727204130.46864: checking to see if all hosts have failed and the running result is not ok 12755 1727204130.46866: done checking to see if all hosts have failed 12755 1727204130.46867: getting the remaining hosts for this loop 12755 1727204130.46868: done getting the remaining hosts for this loop 12755 1727204130.46875: getting the next task for host managed-node1 12755 1727204130.46884: done getting next task for host managed-node1 12755 1727204130.46891: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12755 1727204130.46894: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204130.46919: getting variables 12755 1727204130.46921: in VariableManager get_vars() 12755 1727204130.46988: Calling all_inventory to load vars for managed-node1 12755 1727204130.47301: Calling groups_inventory to load vars for managed-node1 12755 1727204130.47306: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204130.47319: Calling all_plugins_play to load vars for managed-node1 12755 1727204130.47322: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204130.47326: Calling groups_plugins_play to load vars for managed-node1 12755 1727204130.49913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204130.53482: done with get_vars() 12755 1727204130.53527: done getting variables 12755 1727204130.53604: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.163) 0:00:55.772 ***** 12755 1727204130.53646: entering _queue_task() for managed-node1/fail 12755 1727204130.54018: worker is 1 (out of 1 available) 12755 1727204130.54035: exiting _queue_task() for managed-node1/fail 12755 1727204130.54050: done queuing things up, now waiting for results queue to drain 12755 1727204130.54051: waiting for pending results... 
12755 1727204130.54328: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12755 1727204130.54527: in run() - task 12b410aa-8751-72e9-1a19-000000000122 12755 1727204130.54537: variable 'ansible_search_path' from source: unknown 12755 1727204130.54546: variable 'ansible_search_path' from source: unknown 12755 1727204130.54633: calling self._execute() 12755 1727204130.54725: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204130.54744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204130.54768: variable 'omit' from source: magic vars 12755 1727204130.55253: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.55272: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204130.55453: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204130.55795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204130.61629: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204130.62097: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204130.62104: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204130.62115: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204130.62242: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204130.62417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12755 1727204130.62680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.62685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.62691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.62808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.62876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.63136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.63174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.63285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.63358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.63504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.63538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.63696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.63755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.63829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.64350: variable 'network_connections' from source: task vars 12755 1727204130.64375: variable 'port1_profile' from source: play vars 12755 1727204130.64686: variable 'port1_profile' from source: play vars 12755 1727204130.64692: variable 'port2_profile' from source: play vars 12755 1727204130.64734: variable 'port2_profile' from source: play vars 12755 1727204130.65024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204130.65522: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204130.65694: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 
1727204130.65740: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204130.65786: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204130.65932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204130.65964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204130.66199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.66203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204130.66327: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204130.67178: variable 'network_connections' from source: task vars 12755 1727204130.67182: variable 'port1_profile' from source: play vars 12755 1727204130.67185: variable 'port1_profile' from source: play vars 12755 1727204130.67298: variable 'port2_profile' from source: play vars 12755 1727204130.67377: variable 'port2_profile' from source: play vars 12755 1727204130.67532: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12755 1727204130.67536: when evaluation is False, skipping this task 12755 1727204130.67539: _execute() done 12755 1727204130.67545: dumping result to json 12755 1727204130.67549: done dumping result, returning 12755 1727204130.67560: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000122] 12755 1727204130.67571: sending task result for task 12b410aa-8751-72e9-1a19-000000000122 12755 1727204130.67710: done sending task result for task 12b410aa-8751-72e9-1a19-000000000122 12755 1727204130.67713: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12755 1727204130.67773: no more pending results, returning what we have 12755 1727204130.67778: results queue empty 12755 1727204130.67779: checking for any_errors_fatal 12755 1727204130.67787: done checking for any_errors_fatal 12755 1727204130.67788: checking for max_fail_percentage 12755 1727204130.67792: done checking for max_fail_percentage 12755 1727204130.67793: checking to see if all hosts have failed and the running result is not ok 12755 1727204130.67795: done checking to see if all hosts have failed 12755 1727204130.67796: getting the remaining hosts for this loop 12755 1727204130.67797: done getting the remaining hosts for this loop 12755 1727204130.67803: getting the next task for host managed-node1 12755 1727204130.67812: done getting next task for host managed-node1 12755 1727204130.67816: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12755 1727204130.67819: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204130.67845: getting variables 12755 1727204130.67847: in VariableManager get_vars() 12755 1727204130.68120: Calling all_inventory to load vars for managed-node1 12755 1727204130.68123: Calling groups_inventory to load vars for managed-node1 12755 1727204130.68127: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204130.68139: Calling all_plugins_play to load vars for managed-node1 12755 1727204130.68143: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204130.68146: Calling groups_plugins_play to load vars for managed-node1 12755 1727204130.72907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204130.78949: done with get_vars() 12755 1727204130.78995: done getting variables 12755 1727204130.79071: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:55:30 -0400 (0:00:00.256) 0:00:56.028 ***** 12755 1727204130.79320: entering _queue_task() for managed-node1/package 12755 1727204130.79944: worker is 1 (out of 1 available) 12755 1727204130.79958: exiting _queue_task() for managed-node1/package 12755 1727204130.79972: done queuing things up, now waiting for results queue to drain 12755 1727204130.79973: waiting for pending results... 
12755 1727204130.80402: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 12755 1727204130.80796: in run() - task 12b410aa-8751-72e9-1a19-000000000123 12755 1727204130.80800: variable 'ansible_search_path' from source: unknown 12755 1727204130.80803: variable 'ansible_search_path' from source: unknown 12755 1727204130.80861: calling self._execute() 12755 1727204130.81087: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204130.81096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204130.81188: variable 'omit' from source: magic vars 12755 1727204130.82132: variable 'ansible_distribution_major_version' from source: facts 12755 1727204130.82145: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204130.82784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204130.83604: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204130.83840: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204130.83881: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204130.84066: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204130.84327: variable 'network_packages' from source: role '' defaults 12755 1727204130.84763: variable '__network_provider_setup' from source: role '' defaults 12755 1727204130.84766: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204130.84908: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204130.84922: variable '__network_packages_default_nm' from source: role '' defaults 12755 1727204130.85099: variable 
'__network_packages_default_nm' from source: role '' defaults 12755 1727204130.85650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204130.88699: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204130.88896: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204130.88900: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204130.88902: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204130.88905: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204130.88951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.89005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.89039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.89297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.89300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 
1727204130.89303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.89305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.89308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.89403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204130.89596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204130.89764: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12755 1727204130.89912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204130.89945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204130.90033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204130.90037: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204130.90045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204130.90161: variable 'ansible_python' from source: facts
12755 1727204130.90203: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
12755 1727204130.90309: variable '__network_wpa_supplicant_required' from source: role '' defaults
12755 1727204130.90402: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
12755 1727204130.90823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204130.90859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204130.90890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204130.90941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204130.90958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204130.91065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204130.91097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204130.91373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204130.91427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204130.91560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204130.91760: variable 'network_connections' from source: task vars
12755 1727204130.91776: variable 'port1_profile' from source: play vars
12755 1727204130.92007: variable 'port1_profile' from source: play vars
12755 1727204130.92010: variable 'port2_profile' from source: play vars
12755 1727204130.92052: variable 'port2_profile' from source: play vars
12755 1727204130.92143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204130.92174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204130.92223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204130.92263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204130.92333: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204130.92756: variable 'network_connections' from source: task vars
12755 1727204130.92762: variable 'port1_profile' from source: play vars
12755 1727204130.92945: variable 'port1_profile' from source: play vars
12755 1727204130.92957: variable 'port2_profile' from source: play vars
12755 1727204130.93088: variable 'port2_profile' from source: play vars
12755 1727204130.93206: variable '__network_packages_default_wireless' from source: role '' defaults
12755 1727204130.93353: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204130.93808: variable 'network_connections' from source: task vars
12755 1727204130.93816: variable 'port1_profile' from source: play vars
12755 1727204130.93903: variable 'port1_profile' from source: play vars
12755 1727204130.93913: variable 'port2_profile' from source: play vars
12755 1727204130.94007: variable 'port2_profile' from source: play vars
12755 1727204130.94034: variable '__network_packages_default_team' from source: role '' defaults
12755 1727204130.94145: variable '__network_team_connections_defined' from source: role '' defaults
12755 1727204130.94579: variable 'network_connections' from source: task vars
12755 1727204130.94585: variable 'port1_profile' from source: play vars
12755 1727204130.94673: variable 'port1_profile' from source: play vars
12755 1727204130.94682: variable 'port2_profile' from source: play vars
12755 1727204130.94772: variable 'port2_profile' from source: play vars
12755 1727204130.94855: variable '__network_service_name_default_initscripts' from source: role '' defaults
12755 1727204130.94937: variable '__network_service_name_default_initscripts' from source: role '' defaults
12755 1727204130.94949: variable '__network_packages_default_initscripts' from source: role '' defaults
12755 1727204130.95025: variable '__network_packages_default_initscripts' from source: role '' defaults
12755 1727204130.95351: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
12755 1727204130.96309: variable 'network_connections' from source: task vars
12755 1727204130.96313: variable 'port1_profile' from source: play vars
12755 1727204130.96316: variable 'port1_profile' from source: play vars
12755 1727204130.96319: variable 'port2_profile' from source: play vars
12755 1727204130.96321: variable 'port2_profile' from source: play vars
12755 1727204130.96323: variable 'ansible_distribution' from source: facts
12755 1727204130.96326: variable '__network_rh_distros' from source: role '' defaults
12755 1727204130.96695: variable 'ansible_distribution_major_version' from source: facts
12755 1727204130.96699: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
12755 1727204130.96701: variable 'ansible_distribution' from source: facts
12755 1727204130.96704: variable '__network_rh_distros' from source: role '' defaults
12755 1727204130.96706: variable 'ansible_distribution_major_version' from source: facts
12755 1727204130.96708: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
12755 1727204130.96873: variable 'ansible_distribution' from source: facts
12755 1727204130.96883: variable '__network_rh_distros' from source: role '' defaults
12755 1727204130.96893: variable 'ansible_distribution_major_version' from source: facts
12755 1727204130.96941: variable 'network_provider' from source: set_fact
12755 1727204130.96960: variable 'ansible_facts' from source: unknown
12755 1727204130.98280: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
12755 1727204130.98283: when evaluation is False, skipping this task
12755 1727204130.98286: _execute() done
12755 1727204130.98291: dumping result to json
12755 1727204130.98309: done dumping result, returning
12755 1727204130.98321: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-72e9-1a19-000000000123]
12755 1727204130.98327: sending task result for task 12b410aa-8751-72e9-1a19-000000000123
12755 1727204130.98437: done sending task result for task 12b410aa-8751-72e9-1a19-000000000123
12755 1727204130.98441: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
12755 1727204130.98515: no more pending results, returning what we have
12755 1727204130.98520: results queue empty
12755 1727204130.98521: checking for any_errors_fatal
12755 1727204130.98529: done checking for any_errors_fatal
12755 1727204130.98530: checking for max_fail_percentage
12755 1727204130.98532: done checking for max_fail_percentage
12755 1727204130.98533: checking to see if all hosts have failed and the running result is not ok
12755 1727204130.98534: done checking to see if all hosts have failed
12755 1727204130.98535: getting the remaining hosts for this loop
12755 1727204130.98536: done getting the remaining hosts for this loop
12755 1727204130.98541: getting the next task for host managed-node1
12755 1727204130.98550: done getting next task for host managed-node1
12755 1727204130.98554: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12755 1727204130.98557: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204130.98586: getting variables
12755 1727204130.98587: in VariableManager get_vars()
12755 1727204130.98645: Calling all_inventory to load vars for managed-node1
12755 1727204130.98649: Calling groups_inventory to load vars for managed-node1
12755 1727204130.98652: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204130.98664: Calling all_plugins_play to load vars for managed-node1
12755 1727204130.98667: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204130.98671: Calling groups_plugins_play to load vars for managed-node1
12755 1727204131.01633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204131.04804: done with get_vars()
12755 1727204131.04846: done getting variables
12755 1727204131.04927: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.256) 0:00:56.285 *****
12755 1727204131.04968: entering _queue_task() for managed-node1/package
12755 1727204131.05357: worker is 1 (out of 1 available)
12755 1727204131.05377: exiting _queue_task() for managed-node1/package
12755 1727204131.05597: done queuing things up, now waiting for results queue to drain
12755 1727204131.05599: waiting for pending results...
12755 1727204131.05810: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12755 1727204131.05939: in run() - task 12b410aa-8751-72e9-1a19-000000000124
12755 1727204131.05967: variable 'ansible_search_path' from source: unknown
12755 1727204131.05977: variable 'ansible_search_path' from source: unknown
12755 1727204131.06027: calling self._execute()
12755 1727204131.06157: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204131.06175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204131.06260: variable 'omit' from source: magic vars
12755 1727204131.06681: variable 'ansible_distribution_major_version' from source: facts
12755 1727204131.06712: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204131.06878: variable 'network_state' from source: role '' defaults
12755 1727204131.06898: Evaluated conditional (network_state != {}): False
12755 1727204131.06913: when evaluation is False, skipping this task
12755 1727204131.06930: _execute() done
12755 1727204131.06943: dumping result to json
12755 1727204131.06953: done dumping result, returning
12755 1727204131.07032: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-72e9-1a19-000000000124]
12755 1727204131.07036: sending task result for task 12b410aa-8751-72e9-1a19-000000000124
12755 1727204131.07122: done sending task result for task 12b410aa-8751-72e9-1a19-000000000124
12755 1727204131.07126: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12755 1727204131.07196: no more pending results, returning what we have
12755 1727204131.07201: results queue empty
12755 1727204131.07202: checking for any_errors_fatal
12755 1727204131.07209: done checking for any_errors_fatal
12755 1727204131.07210: checking for max_fail_percentage
12755 1727204131.07213: done checking for max_fail_percentage
12755 1727204131.07214: checking to see if all hosts have failed and the running result is not ok
12755 1727204131.07215: done checking to see if all hosts have failed
12755 1727204131.07216: getting the remaining hosts for this loop
12755 1727204131.07218: done getting the remaining hosts for this loop
12755 1727204131.07223: getting the next task for host managed-node1
12755 1727204131.07232: done getting next task for host managed-node1
12755 1727204131.07236: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
12755 1727204131.07240: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204131.07272: getting variables
12755 1727204131.07274: in VariableManager get_vars()
12755 1727204131.07447: Calling all_inventory to load vars for managed-node1
12755 1727204131.07451: Calling groups_inventory to load vars for managed-node1
12755 1727204131.07454: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204131.07472: Calling all_plugins_play to load vars for managed-node1
12755 1727204131.07477: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204131.07482: Calling groups_plugins_play to load vars for managed-node1
12755 1727204131.09800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204131.12707: done with get_vars()
12755 1727204131.12741: done getting variables
12755 1727204131.12805: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.078) 0:00:56.364 *****
12755 1727204131.12841: entering _queue_task() for managed-node1/package
12755 1727204131.13203: worker is 1 (out of 1 available)
12755 1727204131.13220: exiting _queue_task() for managed-node1/package
12755 1727204131.13235: done queuing things up, now waiting for results queue to drain
12755 1727204131.13237: waiting for pending results...
12755 1727204131.13621: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
12755 1727204131.13896: in run() - task 12b410aa-8751-72e9-1a19-000000000125
12755 1727204131.13900: variable 'ansible_search_path' from source: unknown
12755 1727204131.13903: variable 'ansible_search_path' from source: unknown
12755 1727204131.13906: calling self._execute()
12755 1727204131.13951: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204131.13966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204131.13985: variable 'omit' from source: magic vars
12755 1727204131.14455: variable 'ansible_distribution_major_version' from source: facts
12755 1727204131.14476: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204131.14641: variable 'network_state' from source: role '' defaults
12755 1727204131.14660: Evaluated conditional (network_state != {}): False
12755 1727204131.14674: when evaluation is False, skipping this task
12755 1727204131.14684: _execute() done
12755 1727204131.14695: dumping result to json
12755 1727204131.14703: done dumping result, returning
12755 1727204131.14718: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-72e9-1a19-000000000125]
12755 1727204131.14730: sending task result for task 12b410aa-8751-72e9-1a19-000000000125
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12755 1727204131.14903: no more pending results, returning what we have
12755 1727204131.14908: results queue empty
12755 1727204131.14910: checking for any_errors_fatal
12755 1727204131.14918: done checking for any_errors_fatal
12755 1727204131.14919: checking for max_fail_percentage
12755 1727204131.14922: done checking for max_fail_percentage
12755 1727204131.14923: checking to see if all hosts have failed and the running result is not ok
12755 1727204131.14925: done checking to see if all hosts have failed
12755 1727204131.14926: getting the remaining hosts for this loop
12755 1727204131.14928: done getting the remaining hosts for this loop
12755 1727204131.14933: getting the next task for host managed-node1
12755 1727204131.14942: done getting next task for host managed-node1
12755 1727204131.14946: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
12755 1727204131.14951: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204131.14978: getting variables
12755 1727204131.14981: in VariableManager get_vars()
12755 1727204131.15049: Calling all_inventory to load vars for managed-node1
12755 1727204131.15053: Calling groups_inventory to load vars for managed-node1
12755 1727204131.15056: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204131.15072: Calling all_plugins_play to load vars for managed-node1
12755 1727204131.15076: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204131.15081: Calling groups_plugins_play to load vars for managed-node1
12755 1727204131.16106: done sending task result for task 12b410aa-8751-72e9-1a19-000000000125
12755 1727204131.16110: WORKER PROCESS EXITING
12755 1727204131.17645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204131.20577: done with get_vars()
12755 1727204131.20625: done getting variables
12755 1727204131.20697: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.078) 0:00:56.443 *****
12755 1727204131.20738: entering _queue_task() for managed-node1/service
12755 1727204131.21113: worker is 1 (out of 1 available)
12755 1727204131.21128: exiting _queue_task() for managed-node1/service
12755 1727204131.21142: done queuing things up, now waiting for results queue to drain
12755 1727204131.21144: waiting for pending results...
12755 1727204131.21468: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
12755 1727204131.21674: in run() - task 12b410aa-8751-72e9-1a19-000000000126
12755 1727204131.21699: variable 'ansible_search_path' from source: unknown
12755 1727204131.21709: variable 'ansible_search_path' from source: unknown
12755 1727204131.21766: calling self._execute()
12755 1727204131.21887: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204131.21905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204131.21923: variable 'omit' from source: magic vars
12755 1727204131.22399: variable 'ansible_distribution_major_version' from source: facts
12755 1727204131.22418: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204131.22580: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204131.22867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204131.25493: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204131.25581: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204131.25629: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204131.25679: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204131.25724: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204131.25842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204131.25910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204131.25951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204131.26018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204131.26044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204131.26118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204131.26155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204131.26193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204131.26254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204131.26277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204131.26342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204131.26380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204131.26424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204131.26533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204131.26536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204131.26749: variable 'network_connections' from source: task vars
12755 1727204131.26770: variable 'port1_profile' from source: play vars
12755 1727204131.26862: variable 'port1_profile' from source: play vars
12755 1727204131.26880: variable 'port2_profile' from source: play vars
12755 1727204131.26960: variable 'port2_profile' from source: play vars
12755 1727204131.27061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204131.27276: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204131.27394: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204131.27398: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204131.27421: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204131.27479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204131.27517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204131.27555: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204131.27595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204131.27666: variable '__network_team_connections_defined' from source: role '' defaults
12755 1727204131.28016: variable 'network_connections' from source: task vars
12755 1727204131.28028: variable 'port1_profile' from source: play vars
12755 1727204131.28111: variable 'port1_profile' from source: play vars
12755 1727204131.28169: variable 'port2_profile' from source: play vars
12755 1727204131.28208: variable 'port2_profile' from source: play vars
12755 1727204131.28243: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12755 1727204131.28254: when evaluation is False, skipping this task
12755 1727204131.28263: _execute() done
12755 1727204131.28277: dumping result to json
12755 1727204131.28287: done dumping result, returning
12755 1727204131.28305: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000126]
12755 1727204131.28387: sending task result for task 12b410aa-8751-72e9-1a19-000000000126
12755 1727204131.28470: done sending task result for task 12b410aa-8751-72e9-1a19-000000000126
12755 1727204131.28473: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12755 1727204131.28547: no more pending results, returning what we have
12755 1727204131.28552: results queue empty
12755 1727204131.28553: checking for any_errors_fatal
12755 1727204131.28561: done checking for any_errors_fatal
12755 1727204131.28562: checking for max_fail_percentage
12755 1727204131.28564: done checking for max_fail_percentage
12755 1727204131.28565: checking to see if all hosts have failed and the running result is not ok
12755 1727204131.28567: done checking to see if all hosts have failed
12755 1727204131.28568: getting the remaining hosts for this loop
12755 1727204131.28570: done getting the remaining hosts for this loop
12755 1727204131.28576: getting the next task for host managed-node1
12755 1727204131.28585: done getting next task for host managed-node1
12755 1727204131.28593: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
12755 1727204131.28597: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204131.28622: getting variables
12755 1727204131.28625: in VariableManager get_vars()
12755 1727204131.28895: Calling all_inventory to load vars for managed-node1
12755 1727204131.28899: Calling groups_inventory to load vars for managed-node1
12755 1727204131.28903: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204131.28916: Calling all_plugins_play to load vars for managed-node1
12755 1727204131.28920: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204131.28924: Calling groups_plugins_play to load vars for managed-node1
12755 1727204131.30901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204131.33293: done with get_vars()
12755 1727204131.33333: done getting variables
12755 1727204131.33412: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:55:31 -0400 (0:00:00.127) 0:00:56.570 *****
12755 1727204131.33453: entering _queue_task() for managed-node1/service
12755 1727204131.33949: worker is 1 (out of 1 available)
12755 1727204131.33964: exiting _queue_task() for managed-node1/service
12755 1727204131.33979: done queuing things up, now waiting for results queue to drain
12755 1727204131.33981: waiting for pending results...
12755 1727204131.34225: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
12755 1727204131.34403: in run() - task 12b410aa-8751-72e9-1a19-000000000127
12755 1727204131.34413: variable 'ansible_search_path' from source: unknown
12755 1727204131.34417: variable 'ansible_search_path' from source: unknown
12755 1727204131.34419: calling self._execute()
12755 1727204131.34581: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204131.34586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204131.34595: variable 'omit' from source: magic vars
12755 1727204131.34949: variable 'ansible_distribution_major_version' from source: facts
12755 1727204131.34962: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204131.35106: variable 'network_provider' from source: set_fact
12755 1727204131.35117: variable 'network_state' from source: role '' defaults
12755 1727204131.35123: Evaluated conditional (network_provider == "nm" or network_state != {}): True
12755 1727204131.35130: variable 'omit' from source: magic vars
12755 1727204131.35186: variable 'omit' from source: magic vars
12755 1727204131.35215: variable 'network_service_name' from source: role '' defaults
12755 1727204131.35279: variable 'network_service_name' from source: role '' defaults
12755 1727204131.35386: variable '__network_provider_setup' from source: role '' defaults
12755 1727204131.35401: variable '__network_service_name_default_nm' from source: role '' defaults
12755 1727204131.35448: variable '__network_service_name_default_nm' from source: role '' defaults
12755 1727204131.35456: variable '__network_packages_default_nm' from source: role '' defaults
12755 1727204131.35514: variable '__network_packages_default_nm' from source: role '' defaults
12755 1727204131.35766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204131.38423: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204131.38465: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204131.38519: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204131.38575: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204131.38700: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204131.38705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204131.38742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204131.38796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204131.38825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204131.38843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204131.38943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204131.38948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204131.39040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204131.39044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204131.39047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204131.39387: variable '__network_packages_default_gobject_packages' from source: role '' defaults
12755 1727204131.39548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204131.39552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204131.39619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204131.39687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204131.39697: Loading FilterModule
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204131.39776: variable 'ansible_python' from source: facts 12755 1727204131.39804: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12755 1727204131.39997: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204131.40012: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12755 1727204131.40131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204131.40153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204131.40176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204131.40259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204131.40262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204131.40361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204131.40696: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204131.40699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204131.40702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204131.40705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204131.41002: variable 'network_connections' from source: task vars 12755 1727204131.41016: variable 'port1_profile' from source: play vars 12755 1727204131.41192: variable 'port1_profile' from source: play vars 12755 1727204131.41370: variable 'port2_profile' from source: play vars 12755 1727204131.41456: variable 'port2_profile' from source: play vars 12755 1727204131.41581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204131.41739: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204131.41814: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204131.41880: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204131.41967: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204131.42100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204131.42104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204131.42115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204131.42161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204131.42220: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204131.42623: variable 'network_connections' from source: task vars 12755 1727204131.42630: variable 'port1_profile' from source: play vars 12755 1727204131.42727: variable 'port1_profile' from source: play vars 12755 1727204131.42741: variable 'port2_profile' from source: play vars 12755 1727204131.42897: variable 'port2_profile' from source: play vars 12755 1727204131.42901: variable '__network_packages_default_wireless' from source: role '' defaults 12755 1727204131.42981: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204131.43400: variable 'network_connections' from source: task vars 12755 1727204131.43404: variable 'port1_profile' from source: play vars 12755 1727204131.43476: variable 'port1_profile' from source: play vars 12755 1727204131.43495: variable 'port2_profile' from source: play vars 12755 1727204131.43603: variable 'port2_profile' from source: play vars 12755 1727204131.43652: variable '__network_packages_default_team' from source: role '' defaults 12755 1727204131.43767: variable '__network_team_connections_defined' from source: role '' defaults 
12755 1727204131.44201: variable 'network_connections' from source: task vars 12755 1727204131.44213: variable 'port1_profile' from source: play vars 12755 1727204131.44351: variable 'port1_profile' from source: play vars 12755 1727204131.44355: variable 'port2_profile' from source: play vars 12755 1727204131.44403: variable 'port2_profile' from source: play vars 12755 1727204131.44469: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204131.44578: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204131.44582: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204131.44716: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204131.45039: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12755 1727204131.45753: variable 'network_connections' from source: task vars 12755 1727204131.45765: variable 'port1_profile' from source: play vars 12755 1727204131.45855: variable 'port1_profile' from source: play vars 12755 1727204131.45894: variable 'port2_profile' from source: play vars 12755 1727204131.45958: variable 'port2_profile' from source: play vars 12755 1727204131.45970: variable 'ansible_distribution' from source: facts 12755 1727204131.45978: variable '__network_rh_distros' from source: role '' defaults 12755 1727204131.45987: variable 'ansible_distribution_major_version' from source: facts 12755 1727204131.46011: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12755 1727204131.46594: variable 'ansible_distribution' from source: facts 12755 1727204131.46598: variable '__network_rh_distros' from source: role '' defaults 12755 1727204131.46601: variable 'ansible_distribution_major_version' from source: facts 12755 1727204131.46603: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' 
defaults 12755 1727204131.46830: variable 'ansible_distribution' from source: facts 12755 1727204131.46852: variable '__network_rh_distros' from source: role '' defaults 12755 1727204131.46863: variable 'ansible_distribution_major_version' from source: facts 12755 1727204131.46913: variable 'network_provider' from source: set_fact 12755 1727204131.46956: variable 'omit' from source: magic vars 12755 1727204131.47062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204131.47066: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204131.47068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204131.47094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204131.47114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204131.47156: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204131.47172: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204131.47185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204131.47329: Set connection var ansible_connection to ssh 12755 1727204131.47387: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204131.47390: Set connection var ansible_shell_type to sh 12755 1727204131.47394: Set connection var ansible_timeout to 10 12755 1727204131.47397: Set connection var ansible_shell_executable to /bin/sh 12755 1727204131.47399: Set connection var ansible_pipelining to False 12755 1727204131.47495: variable 'ansible_shell_executable' from source: unknown 12755 1727204131.47498: variable 'ansible_connection' from source: unknown 12755 1727204131.47502: variable 
'ansible_module_compression' from source: unknown 12755 1727204131.47504: variable 'ansible_shell_type' from source: unknown 12755 1727204131.47513: variable 'ansible_shell_executable' from source: unknown 12755 1727204131.47515: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204131.47517: variable 'ansible_pipelining' from source: unknown 12755 1727204131.47519: variable 'ansible_timeout' from source: unknown 12755 1727204131.47522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204131.47765: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204131.47768: variable 'omit' from source: magic vars 12755 1727204131.47771: starting attempt loop 12755 1727204131.47774: running the handler 12755 1727204131.47891: variable 'ansible_facts' from source: unknown 12755 1727204131.50381: _low_level_execute_command(): starting 12755 1727204131.50399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204131.52024: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204131.52275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204131.52425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204131.54356: stdout chunk (state=3): >>>/root <<< 12755 1727204131.54437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204131.54501: stderr chunk (state=3): >>><<< 12755 1727204131.54695: stdout chunk (state=3): >>><<< 12755 1727204131.54699: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 12755 1727204131.54702: _low_level_execute_command(): starting 12755 1727204131.54704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291 `" && echo ansible-tmp-1727204131.5459304-15829-18022449183291="` echo /root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291 `" ) && sleep 0' 12755 1727204131.55407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204131.55491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204131.55537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204131.55561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204131.55588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204131.55652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204131.57796: stdout chunk (state=3): 
>>>ansible-tmp-1727204131.5459304-15829-18022449183291=/root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291 <<< 12755 1727204131.58000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204131.58029: stdout chunk (state=3): >>><<< 12755 1727204131.58033: stderr chunk (state=3): >>><<< 12755 1727204131.58052: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204131.5459304-15829-18022449183291=/root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204131.58112: variable 'ansible_module_compression' from source: unknown 12755 1727204131.58195: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12755 1727204131.58260: variable 'ansible_facts' from source: unknown 12755 
1727204131.58495: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/AnsiballZ_systemd.py 12755 1727204131.58729: Sending initial data 12755 1727204131.58742: Sent initial data (155 bytes) 12755 1727204131.59514: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204131.59546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204131.59567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204131.59583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204131.59744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204131.61500: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204131.61539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12755 1727204131.61627: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpnfqpnz2b /root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/AnsiballZ_systemd.py <<< 12755 1727204131.61632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/AnsiballZ_systemd.py" <<< 12755 1727204131.61635: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 12755 1727204131.61680: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpnfqpnz2b" to remote "/root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/AnsiballZ_systemd.py" <<< 12755 1727204131.64698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204131.64703: stderr chunk (state=3): >>><<< 12755 1727204131.64706: stdout chunk (state=3): >>><<< 12755 1727204131.64708: done transferring module to remote 12755 1727204131.64740: _low_level_execute_command(): starting 12755 1727204131.64744: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/ /root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/AnsiballZ_systemd.py && sleep 0' 12755 
1727204131.65367: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204131.65377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204131.65411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204131.65415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204131.65424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204131.65495: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204131.65535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204131.65576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204131.65617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204131.67702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204131.67733: stdout chunk (state=3): >>><<< 12755 1727204131.67737: stderr chunk (state=3): >>><<< 12755 1727204131.67757: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204131.67855: _low_level_execute_command(): starting 12755 1727204131.67860: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/AnsiballZ_systemd.py && sleep 0' 12755 1727204131.68427: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204131.68505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204131.68546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204131.68565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204131.68590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204131.68677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204132.03431: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "12271616", "MemoryAvailable": "infinity", "CPUUsageNSec": "1141803000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", 
"StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 12755 1727204132.03443: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", 
"SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12755 1727204132.05828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204132.05866: stderr chunk (state=3): >>><<< 12755 1727204132.05879: stdout chunk (state=3): >>><<< 12755 1727204132.05912: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "12271616", "MemoryAvailable": "infinity", "CPUUsageNSec": "1141803000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", 
"PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service 
multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": 
"inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204132.06277: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204132.06378: _low_level_execute_command(): starting 12755 1727204132.06382: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204131.5459304-15829-18022449183291/ > /dev/null 2>&1 && sleep 0' 12755 1727204132.07079: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204132.07171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204132.07226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204132.07244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204132.07270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204132.07356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204132.09432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204132.09436: stderr chunk (state=3): >>><<< 12755 1727204132.09439: stdout chunk (state=3): >>><<< 12755 1727204132.09455: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204132.09469: handler run complete 12755 1727204132.09562: 
attempt loop complete, returning result 12755 1727204132.09624: _execute() done 12755 1727204132.09628: dumping result to json 12755 1727204132.09795: done dumping result, returning 12755 1727204132.09798: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-72e9-1a19-000000000127] 12755 1727204132.09801: sending task result for task 12b410aa-8751-72e9-1a19-000000000127 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204132.10375: no more pending results, returning what we have 12755 1727204132.10380: results queue empty 12755 1727204132.10381: checking for any_errors_fatal 12755 1727204132.10393: done checking for any_errors_fatal 12755 1727204132.10394: checking for max_fail_percentage 12755 1727204132.10397: done checking for max_fail_percentage 12755 1727204132.10398: checking to see if all hosts have failed and the running result is not ok 12755 1727204132.10399: done checking to see if all hosts have failed 12755 1727204132.10400: getting the remaining hosts for this loop 12755 1727204132.10402: done getting the remaining hosts for this loop 12755 1727204132.10408: getting the next task for host managed-node1 12755 1727204132.10419: done getting next task for host managed-node1 12755 1727204132.10423: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12755 1727204132.10426: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204132.10443: getting variables 12755 1727204132.10446: in VariableManager get_vars() 12755 1727204132.10716: Calling all_inventory to load vars for managed-node1 12755 1727204132.10720: Calling groups_inventory to load vars for managed-node1 12755 1727204132.10724: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204132.10736: Calling all_plugins_play to load vars for managed-node1 12755 1727204132.10740: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204132.10745: Calling groups_plugins_play to load vars for managed-node1 12755 1727204132.11306: done sending task result for task 12b410aa-8751-72e9-1a19-000000000127 12755 1727204132.11313: WORKER PROCESS EXITING 12755 1727204132.13215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204132.18368: done with get_vars() 12755 1727204132.18412: done getting variables 12755 1727204132.18488: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.853) 0:00:57.424 ***** 12755 1727204132.18839: entering _queue_task() for managed-node1/service 12755 1727204132.19532: worker is 1 (out of 1 available) 12755 1727204132.19546: exiting _queue_task() for managed-node1/service 12755 1727204132.19559: done queuing things up, now waiting for results queue to drain 12755 1727204132.19560: 
waiting for pending results... 12755 1727204132.19944: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12755 1727204132.20149: in run() - task 12b410aa-8751-72e9-1a19-000000000128 12755 1727204132.20174: variable 'ansible_search_path' from source: unknown 12755 1727204132.20185: variable 'ansible_search_path' from source: unknown 12755 1727204132.20241: calling self._execute() 12755 1727204132.20370: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204132.20385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204132.20407: variable 'omit' from source: magic vars 12755 1727204132.20923: variable 'ansible_distribution_major_version' from source: facts 12755 1727204132.20946: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204132.21124: variable 'network_provider' from source: set_fact 12755 1727204132.21138: Evaluated conditional (network_provider == "nm"): True 12755 1727204132.21270: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204132.21418: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12755 1727204132.21655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204132.24361: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204132.24474: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204132.24507: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204132.24550: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204132.24595: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204132.24815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204132.24908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204132.24911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204132.24955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204132.24978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204132.25053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204132.25125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204132.25128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204132.25191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204132.25216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204132.25283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204132.25319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204132.25364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204132.25450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204132.25454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204132.25649: variable 'network_connections' from source: task vars
12755 1727204132.25675: variable 'port1_profile' from source: play vars
12755 1727204132.25766: variable 'port1_profile' from source: play vars
12755 1727204132.25887: variable 'port2_profile' from source: play vars
12755 1727204132.25893: variable 'port2_profile' from source: play vars
12755 1727204132.25972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204132.26191: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204132.26254: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204132.26299: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204132.26354: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204132.26412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204132.26657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204132.26661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204132.26664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204132.26727: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204132.27452: variable 'network_connections' from source: task vars
12755 1727204132.27465: variable 'port1_profile' from source: play vars
12755 1727204132.27555: variable 'port1_profile' from source: play vars
12755 1727204132.27573: variable 'port2_profile' from source: play vars
12755 1727204132.27662: variable 'port2_profile' from source: play vars
12755 1727204132.27705: Evaluated conditional (__network_wpa_supplicant_required): False
12755 1727204132.27715: when evaluation is False, skipping this task
12755 1727204132.27732: _execute() done
12755 1727204132.27750: dumping result to json
12755 1727204132.27760: done dumping result, returning
12755 1727204132.27780: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-72e9-1a19-000000000128]
12755 1727204132.27795: sending task result for task 12b410aa-8751-72e9-1a19-000000000128
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
12755 1727204132.28051: no more pending results, returning what we have
12755 1727204132.28055: results queue empty
12755 1727204132.28057: checking for any_errors_fatal
12755 1727204132.28092: done checking for any_errors_fatal
12755 1727204132.28094: checking for max_fail_percentage
12755 1727204132.28096: done checking for max_fail_percentage
12755 1727204132.28097: checking to see if all hosts have failed and the running result is not ok
12755 1727204132.28098: done checking to see if all hosts have failed
12755 1727204132.28099: getting the remaining hosts for this loop
12755 1727204132.28182: done getting the remaining hosts for this loop
12755 1727204132.28188: getting the next task for host managed-node1
12755 1727204132.28200: done getting next task for host managed-node1
12755 1727204132.28206: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
12755 1727204132.28210: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204132.28239: getting variables
12755 1727204132.28241: in VariableManager get_vars()
12755 1727204132.28424: Calling all_inventory to load vars for managed-node1
12755 1727204132.28428: Calling groups_inventory to load vars for managed-node1
12755 1727204132.28431: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204132.28441: done sending task result for task 12b410aa-8751-72e9-1a19-000000000128
12755 1727204132.28445: WORKER PROCESS EXITING
12755 1727204132.28456: Calling all_plugins_play to load vars for managed-node1
12755 1727204132.28460: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204132.28465: Calling groups_plugins_play to load vars for managed-node1
12755 1727204132.31002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204132.34402: done with get_vars()
12755 1727204132.34451: done getting variables
12755 1727204132.34529: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.157) 0:00:57.581 *****
12755 1727204132.34570: entering _queue_task() for managed-node1/service
12755 1727204132.35075: worker is 1 (out of 1 available)
12755 1727204132.35088: exiting _queue_task() for managed-node1/service
12755 1727204132.35103: done queuing things up, now waiting for results queue to drain
12755 1727204132.35105: waiting for pending results...
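The skip pattern in the trace above is worth spelling out: the role's `when` conditional (`__network_wpa_supplicant_required`) evaluated to False, so the task's module was never executed and a structured "skipping" result was returned instead. The following is a minimal illustration of that behavior only; it is not Ansible's actual TaskExecutor code, and the function name `execute_task` is invented for this sketch.

```python
# Illustration of the skip logic seen in the log above (NOT Ansible's
# real TaskExecutor): a task whose `when` conditional is False is not
# run; a result dict matching the "skipping: [managed-node1]" JSON is
# produced instead.

def execute_task(task, conditional_results):
    """Run a task, or skip it when any `when` conditional is False."""
    for cond in task.get("when", []):
        if not conditional_results.get(cond, False):
            return {
                "changed": False,
                "false_condition": cond,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": True}

# The wpa_supplicant task from the log: its conditional evaluated to False.
result = execute_task(
    {"name": "Enable and start wpa_supplicant",
     "when": ["__network_wpa_supplicant_required"]},
    {"__network_wpa_supplicant_required": False},
)
```

The same shape explains the later skips in this section, where `network_provider == "initscripts"` is the false condition.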
12755 1727204132.35337: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service
12755 1727204132.35534: in run() - task 12b410aa-8751-72e9-1a19-000000000129
12755 1727204132.35564: variable 'ansible_search_path' from source: unknown
12755 1727204132.35575: variable 'ansible_search_path' from source: unknown
12755 1727204132.35631: calling self._execute()
12755 1727204132.35765: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204132.35784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204132.35804: variable 'omit' from source: magic vars
12755 1727204132.36287: variable 'ansible_distribution_major_version' from source: facts
12755 1727204132.36311: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204132.36540: variable 'network_provider' from source: set_fact
12755 1727204132.36544: Evaluated conditional (network_provider == "initscripts"): False
12755 1727204132.36547: when evaluation is False, skipping this task
12755 1727204132.36550: _execute() done
12755 1727204132.36552: dumping result to json
12755 1727204132.36555: done dumping result, returning
12755 1727204132.36558: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-72e9-1a19-000000000129]
12755 1727204132.36560: sending task result for task 12b410aa-8751-72e9-1a19-000000000129
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
12755 1727204132.36851: no more pending results, returning what we have
12755 1727204132.36855: results queue empty
12755 1727204132.36857: checking for any_errors_fatal
12755 1727204132.36869: done checking for any_errors_fatal
12755 1727204132.36870: checking for max_fail_percentage
12755 1727204132.36873: done checking for max_fail_percentage
12755 1727204132.36874: checking to see if all hosts have failed and the running result is not ok
12755 1727204132.36876: done checking to see if all hosts have failed
12755 1727204132.36877: getting the remaining hosts for this loop
12755 1727204132.36878: done getting the remaining hosts for this loop
12755 1727204132.36884: getting the next task for host managed-node1
12755 1727204132.36895: done getting next task for host managed-node1
12755 1727204132.36900: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
12755 1727204132.36905: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204132.36937: getting variables
12755 1727204132.36939: in VariableManager get_vars()
12755 1727204132.37124: Calling all_inventory to load vars for managed-node1
12755 1727204132.37128: Calling groups_inventory to load vars for managed-node1
12755 1727204132.37131: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204132.37139: done sending task result for task 12b410aa-8751-72e9-1a19-000000000129
12755 1727204132.37144: WORKER PROCESS EXITING
12755 1727204132.37155: Calling all_plugins_play to load vars for managed-node1
12755 1727204132.37159: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204132.37164: Calling groups_plugins_play to load vars for managed-node1
12755 1727204132.39643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204132.42601: done with get_vars()
12755 1727204132.42641: done getting variables
12755 1727204132.42719: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.081) 0:00:57.663 *****
12755 1727204132.42767: entering _queue_task() for managed-node1/copy
12755 1727204132.43331: worker is 1 (out of 1 available)
12755 1727204132.43344: exiting _queue_task() for managed-node1/copy
12755 1727204132.43357: done queuing things up, now waiting for results queue to drain
12755 1727204132.43359: waiting for pending results...
12755 1727204132.43539: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
12755 1727204132.43732: in run() - task 12b410aa-8751-72e9-1a19-00000000012a
12755 1727204132.43756: variable 'ansible_search_path' from source: unknown
12755 1727204132.43766: variable 'ansible_search_path' from source: unknown
12755 1727204132.43823: calling self._execute()
12755 1727204132.43954: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204132.43969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204132.43988: variable 'omit' from source: magic vars
12755 1727204132.44476: variable 'ansible_distribution_major_version' from source: facts
12755 1727204132.44500: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204132.44657: variable 'network_provider' from source: set_fact
12755 1727204132.44670: Evaluated conditional (network_provider == "initscripts"): False
12755 1727204132.44688: when evaluation is False, skipping this task
12755 1727204132.44698: _execute() done
12755 1727204132.44707: dumping result to json
12755 1727204132.44716: done dumping result, returning
12755 1727204132.44730: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-72e9-1a19-00000000012a]
12755 1727204132.44744: sending task result for task 12b410aa-8751-72e9-1a19-00000000012a
12755 1727204132.44970: done sending task result for task 12b410aa-8751-72e9-1a19-00000000012a
12755 1727204132.44973: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
12755 1727204132.45034: no more pending results, returning what we have
12755 1727204132.45038: results queue empty
12755 1727204132.45040: checking for any_errors_fatal
12755 1727204132.45047: done checking for any_errors_fatal
12755 1727204132.45049: checking for max_fail_percentage
12755 1727204132.45051: done checking for max_fail_percentage
12755 1727204132.45053: checking to see if all hosts have failed and the running result is not ok
12755 1727204132.45054: done checking to see if all hosts have failed
12755 1727204132.45055: getting the remaining hosts for this loop
12755 1727204132.45057: done getting the remaining hosts for this loop
12755 1727204132.45063: getting the next task for host managed-node1
12755 1727204132.45071: done getting next task for host managed-node1
12755 1727204132.45076: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
12755 1727204132.45080: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12755 1727204132.45109: getting variables
12755 1727204132.45112: in VariableManager get_vars()
12755 1727204132.45179: Calling all_inventory to load vars for managed-node1
12755 1727204132.45183: Calling groups_inventory to load vars for managed-node1
12755 1727204132.45187: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204132.45417: Calling all_plugins_play to load vars for managed-node1
12755 1727204132.45422: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204132.45427: Calling groups_plugins_play to load vars for managed-node1
12755 1727204132.47592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204132.50657: done with get_vars()
12755 1727204132.50695: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024 14:55:32 -0400 (0:00:00.080) 0:00:57.743 *****
12755 1727204132.50807: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
12755 1727204132.51280: worker is 1 (out of 1 available)
12755 1727204132.51298: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
12755 1727204132.51311: done queuing things up, now waiting for results queue to drain
12755 1727204132.51312: waiting for pending results...
12755 1727204132.51612: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
12755 1727204132.51797: in run() - task 12b410aa-8751-72e9-1a19-00000000012b
12755 1727204132.51801: variable 'ansible_search_path' from source: unknown
12755 1727204132.51804: variable 'ansible_search_path' from source: unknown
12755 1727204132.51821: calling self._execute()
12755 1727204132.51945: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204132.51962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204132.51981: variable 'omit' from source: magic vars
12755 1727204132.52447: variable 'ansible_distribution_major_version' from source: facts
12755 1727204132.52473: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204132.52577: variable 'omit' from source: magic vars
12755 1727204132.52581: variable 'omit' from source: magic vars
12755 1727204132.52761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204132.68963: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204132.69067: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204132.69130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204132.69208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204132.69263: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204132.69411: variable 'network_provider' from source: set_fact
12755 1727204132.69580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204132.69639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204132.69799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204132.69861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204132.69885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204132.69986: variable 'omit' from source: magic vars
12755 1727204132.70144: variable 'omit' from source: magic vars
12755 1727204132.70453: variable 'network_connections' from source: task vars
12755 1727204132.70464: variable 'port1_profile' from source: play vars
12755 1727204132.70618: variable 'port1_profile' from source: play vars
12755 1727204132.70723: variable 'port2_profile' from source: play vars
12755 1727204132.70822: variable 'port2_profile' from source: play vars
12755 1727204132.71214: variable 'omit' from source: magic vars
12755 1727204132.71235: variable '__lsr_ansible_managed' from source: task vars
12755 1727204132.71558: variable '__lsr_ansible_managed' from source: task vars
12755 1727204132.71718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
12755 1727204132.72008: Loaded config def from plugin (lookup/template)
12755 1727204132.72024: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
12755 1727204132.72059: File lookup term: get_ansible_managed.j2
12755 1727204132.72068: variable 'ansible_search_path' from source: unknown
12755 1727204132.72079: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
12755 1727204132.72104: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
12755 1727204132.72130: variable 'ansible_search_path' from source: unknown
12755 1727204132.89921: variable 'ansible_managed' from source: unknown
12755 1727204132.90448: variable 'omit' from source: magic vars
12755 1727204132.90541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12755 1727204132.90612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12755 1727204132.90633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12755 1727204132.90698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204132.90832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12755 1727204132.90836: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12755 1727204132.90839: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204132.90842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204132.91157: Set connection var ansible_connection to ssh
12755 1727204132.91161: Set connection var ansible_module_compression to ZIP_DEFLATED
12755 1727204132.91163: Set connection var ansible_shell_type to sh
12755 1727204132.91170: Set connection var ansible_timeout to 10
12755 1727204132.91183: Set connection var ansible_shell_executable to /bin/sh
12755 1727204132.91197: Set connection var ansible_pipelining to False
12755 1727204132.91253: variable 'ansible_shell_executable' from source: unknown
12755 1727204132.91375: variable 'ansible_connection' from source: unknown
12755 1727204132.91378: variable 'ansible_module_compression' from source: unknown
12755 1727204132.91382: variable 'ansible_shell_type' from source: unknown
12755 1727204132.91385: variable 'ansible_shell_executable' from source: unknown
12755 1727204132.91387: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204132.91391: variable 'ansible_pipelining' from source: unknown
12755 1727204132.91393: variable 'ansible_timeout' from source: unknown
12755 1727204132.91395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204132.91717: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
12755 1727204132.91795: variable 'omit' from source: magic vars
12755 1727204132.91798: starting attempt loop
12755 1727204132.91806: running the handler
12755 1727204132.92026: _low_level_execute_command(): starting
12755 1727204132.92029: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12755 1727204132.93612: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<<
12755 1727204132.93637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204132.93665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204132.93825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204132.95743: stdout chunk (state=3): >>>/root <<<
12755 1727204132.96018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204132.96032: stdout chunk (state=3): >>><<<
12755 1727204132.96051: stderr chunk (state=3): >>><<<
12755 1727204132.96171: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12755 1727204132.96195: _low_level_execute_command(): starting
12755 1727204132.96208: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324 `" && echo ansible-tmp-1727204132.961791-15885-37302365443324="` echo /root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324 `" ) && sleep 0'
12755 1727204132.97542: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204132.97557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<<
12755 1727204132.97571: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204132.97784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12755 1727204132.97788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204132.97913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204132.98086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204133.00208: stdout chunk (state=3): >>>ansible-tmp-1727204132.961791-15885-37302365443324=/root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324 <<<
12755 1727204133.00367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204133.00391: stderr chunk (state=3): >>><<<
12755 1727204133.00394: stdout chunk (state=3): >>><<<
12755 1727204133.00409: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204132.961791-15885-37302365443324=/root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12755 1727204133.00699: variable 'ansible_module_compression' from source: unknown
12755 1727204133.00703: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED
12755 1727204133.00705: variable 'ansible_facts' from source: unknown
12755 1727204133.01056: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/AnsiballZ_network_connections.py
12755 1727204133.01421: Sending initial data
12755 1727204133.01480: Sent initial data (166 bytes)
12755 1727204133.02391: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
12755 1727204133.02403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204133.02608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204133.02665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12755 1727204133.02677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204133.02697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204133.02823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204133.04714: stderr chunk (state=3): >>>debug2: Remote version: 3 <<<
12755 1727204133.04729: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<<
12755 1727204133.04745: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<<
12755 1727204133.04765: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
12755 1727204133.04829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
12755 1727204133.04875: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmphnre70nb /root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/AnsiballZ_network_connections.py <<<
12755 1727204133.04902: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/AnsiballZ_network_connections.py" <<<
12755 1727204133.04944: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmphnre70nb" to remote "/root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/AnsiballZ_network_connections.py" <<<
12755 1727204133.07197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204133.07201: stderr chunk (state=3): >>><<<
12755 1727204133.07203: stdout chunk (state=3): >>><<<
12755 1727204133.07205: done transferring module to remote
12755 1727204133.07207: _low_level_execute_command(): starting
12755 1727204133.07210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/ /root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/AnsiballZ_network_connections.py && sleep 0'
12755 1727204133.08111: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
12755 1727204133.08127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12755 1727204133.08155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12755 1727204133.08207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12755 1727204133.08296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12755 1727204133.08317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12755 1727204133.08334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12755 1727204133.08412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12755 1727204133.10619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12755 1727204133.10623: stdout chunk (state=3): >>><<<
12755 1727204133.10626: stderr chunk (state=3): >>><<<
12755 1727204133.10628: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204133.10630: _low_level_execute_command(): starting 12755 1727204133.10632: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/AnsiballZ_network_connections.py && sleep 0' 12755 1727204133.11329: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204133.11334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204133.11363: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204133.11370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 
1727204133.11448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204133.11473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204133.11565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204133.66018: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_st448547/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_st448547/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/b31f9f7b-aae6-41e7-b0f9-8f3978732a4c: error=unknown <<< 12755 1727204133.68216: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_st448547/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_st448547/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on 
bond0.1/c3d9ed9d-1394-45f3-85cd-4954ed7e4e38: error=unknown <<< 12755 1727204133.68284: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12755 1727204133.70670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204133.70674: stdout chunk (state=3): >>><<< 12755 1727204133.70676: stderr chunk (state=3): >>><<< 12755 1727204133.70719: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_st448547/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_st448547/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/b31f9f7b-aae6-41e7-b0f9-8f3978732a4c: error=unknown Traceback (most recent call 
last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_st448547/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_st448547/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/c3d9ed9d-1394-45f3-85cd-4954ed7e4e38: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204133.71097: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204133.71101: _low_level_execute_command(): starting 12755 1727204133.71104: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204132.961791-15885-37302365443324/ > /dev/null 2>&1 && sleep 0' 12755 
1727204133.72867: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204133.72871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204133.72942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204133.73031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204133.73274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204133.75170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204133.75231: stderr chunk (state=3): >>><<< 12755 1727204133.75387: stdout chunk (state=3): >>><<< 12755 1727204133.75394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204133.75397: handler run complete 12755 1727204133.75499: attempt loop complete, returning result 12755 1727204133.75503: _execute() done 12755 1727204133.75506: dumping result to json 12755 1727204133.75812: done dumping result, returning 12755 1727204133.75820: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-72e9-1a19-00000000012b] 12755 1727204133.75826: sending task result for task 12b410aa-8751-72e9-1a19-00000000012b 12755 1727204133.75955: done sending task result for task 12b410aa-8751-72e9-1a19-00000000012b 12755 1727204133.75959: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12755 1727204133.76129: no more pending results, returning what we have 12755 1727204133.76134: results queue empty 
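Note how the exchange above interleaves two Python tracebacks with the final JSON result on the module's stdout, yet the executor still recovers `{"changed": true, ...}` as the task result. A minimal sketch of that recovery, assuming the result is the last line that parses as a JSON object (this is an illustration, not Ansible's actual result parser):

```python
import json

def extract_result(stdout: str) -> dict:
    """Scan mixed module output for the last parseable JSON object.

    Hypothetical helper for illustration; Ansible's real parsing
    lives in the task executor, not in this exact form.
    """
    for line in reversed(stdout.splitlines()):
        line = line.strip()
        if line.startswith("{"):
            try:
                return json.loads(line)
            except json.JSONDecodeError:
                continue
    raise ValueError("no JSON result found in module output")

# Shape mirrors the log above: tracebacks first, JSON result last.
mixed = (
    "Traceback (most recent call last):\n"
    '  File "connection.py", line 113, in _nm_profile_volatile_update2_call_back\n'
    "LsrNetworkNmError: Connection volatilize aborted: error=unknown\n"
    '{"changed": true, "warnings": [], "stderr": "\\n"}\n'
)
result = extract_result(mixed)
print(result["changed"])  # the tracebacks are skipped; the JSON survives
```

This is why the task above still reports `changed: [managed-node1]` with rc=0 despite the `LsrNetworkNmError` tracebacks: they never reach the structured result.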
12755 1727204133.76135: checking for any_errors_fatal 12755 1727204133.76142: done checking for any_errors_fatal 12755 1727204133.76143: checking for max_fail_percentage 12755 1727204133.76145: done checking for max_fail_percentage 12755 1727204133.76146: checking to see if all hosts have failed and the running result is not ok 12755 1727204133.76147: done checking to see if all hosts have failed 12755 1727204133.76148: getting the remaining hosts for this loop 12755 1727204133.76150: done getting the remaining hosts for this loop 12755 1727204133.76155: getting the next task for host managed-node1 12755 1727204133.76162: done getting next task for host managed-node1 12755 1727204133.76166: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12755 1727204133.76170: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204133.76185: getting variables 12755 1727204133.76187: in VariableManager get_vars() 12755 1727204133.76666: Calling all_inventory to load vars for managed-node1 12755 1727204133.76670: Calling groups_inventory to load vars for managed-node1 12755 1727204133.76674: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204133.76686: Calling all_plugins_play to load vars for managed-node1 12755 1727204133.76692: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204133.76697: Calling groups_plugins_play to load vars for managed-node1 12755 1727204133.90849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204133.93708: done with get_vars() 12755 1727204133.93752: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:33 -0400 (0:00:01.430) 0:00:59.174 ***** 12755 1727204133.93847: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204133.94236: worker is 1 (out of 1 available) 12755 1727204133.94253: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204133.94268: done queuing things up, now waiting for results queue to drain 12755 1727204133.94270: waiting for pending results... 
12755 1727204133.94711: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 12755 1727204133.94803: in run() - task 12b410aa-8751-72e9-1a19-00000000012c 12755 1727204133.94830: variable 'ansible_search_path' from source: unknown 12755 1727204133.94836: variable 'ansible_search_path' from source: unknown 12755 1727204133.94878: calling self._execute() 12755 1727204133.95046: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204133.95052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204133.95055: variable 'omit' from source: magic vars 12755 1727204133.95615: variable 'ansible_distribution_major_version' from source: facts 12755 1727204133.95620: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204133.95795: variable 'network_state' from source: role '' defaults 12755 1727204133.95798: Evaluated conditional (network_state != {}): False 12755 1727204133.95801: when evaluation is False, skipping this task 12755 1727204133.95803: _execute() done 12755 1727204133.95805: dumping result to json 12755 1727204133.95807: done dumping result, returning 12755 1727204133.95810: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-72e9-1a19-00000000012c] 12755 1727204133.95812: sending task result for task 12b410aa-8751-72e9-1a19-00000000012c 12755 1727204133.95882: done sending task result for task 12b410aa-8751-72e9-1a19-00000000012c 12755 1727204133.95885: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204133.95946: no more pending results, returning what we have 12755 1727204133.95951: results queue empty 12755 1727204133.95952: checking for any_errors_fatal 12755 1727204133.95968: done checking for any_errors_fatal 
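The skip above follows from two `when` checks the log records for this task: the distribution check passes, but `network_state` is still the role's empty default, so the task never runs. A sketch of that evaluation with values mirroring the log (the major-version fact value `"39"` is hypothetical, not shown in this excerpt; this is illustrative, not Ansible's conditional evaluator):

```python
# Values as reported in the log: the fact-based check is True,
# the role-default check is False, so the task is skipped.
ansible_distribution_major_version = "39"  # hypothetical fact value
network_state = {}                         # role '' defaults, per the log

checks = [
    ("ansible_distribution_major_version != '6'",
     ansible_distribution_major_version != "6"),
    ("network_state != {}", network_state != {}),
]
skipped = not all(result for _, result in checks)
for expr, result in checks:
    print(f"Evaluated conditional ({expr}): {result}")
print("skipping this task" if skipped else "running this task")
```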
12755 1727204133.95969: checking for max_fail_percentage 12755 1727204133.95971: done checking for max_fail_percentage 12755 1727204133.95972: checking to see if all hosts have failed and the running result is not ok 12755 1727204133.95973: done checking to see if all hosts have failed 12755 1727204133.95974: getting the remaining hosts for this loop 12755 1727204133.95976: done getting the remaining hosts for this loop 12755 1727204133.95981: getting the next task for host managed-node1 12755 1727204133.95988: done getting next task for host managed-node1 12755 1727204133.95993: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204133.95997: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204133.96018: getting variables 12755 1727204133.96020: in VariableManager get_vars() 12755 1727204133.96074: Calling all_inventory to load vars for managed-node1 12755 1727204133.96077: Calling groups_inventory to load vars for managed-node1 12755 1727204133.96080: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204133.96195: Calling all_plugins_play to load vars for managed-node1 12755 1727204133.96200: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204133.96204: Calling groups_plugins_play to load vars for managed-node1 12755 1727204133.99156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204134.02057: done with get_vars() 12755 1727204134.02104: done getting variables 12755 1727204134.02179: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.083) 0:00:59.257 ***** 12755 1727204134.02225: entering _queue_task() for managed-node1/debug 12755 1727204134.02612: worker is 1 (out of 1 available) 12755 1727204134.02629: exiting _queue_task() for managed-node1/debug 12755 1727204134.02643: done queuing things up, now waiting for results queue to drain 12755 1727204134.02645: waiting for pending results... 
12755 1727204134.03113: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204134.03172: in run() - task 12b410aa-8751-72e9-1a19-00000000012d 12755 1727204134.03203: variable 'ansible_search_path' from source: unknown 12755 1727204134.03220: variable 'ansible_search_path' from source: unknown 12755 1727204134.03268: calling self._execute() 12755 1727204134.03399: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.03412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.03433: variable 'omit' from source: magic vars 12755 1727204134.03971: variable 'ansible_distribution_major_version' from source: facts 12755 1727204134.03975: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204134.03978: variable 'omit' from source: magic vars 12755 1727204134.04020: variable 'omit' from source: magic vars 12755 1727204134.04066: variable 'omit' from source: magic vars 12755 1727204134.04124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204134.04173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204134.04211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204134.04241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204134.04263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204134.04309: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204134.04320: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.04329: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12755 1727204134.04466: Set connection var ansible_connection to ssh 12755 1727204134.04515: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204134.04519: Set connection var ansible_shell_type to sh 12755 1727204134.04521: Set connection var ansible_timeout to 10 12755 1727204134.04523: Set connection var ansible_shell_executable to /bin/sh 12755 1727204134.04525: Set connection var ansible_pipelining to False 12755 1727204134.04556: variable 'ansible_shell_executable' from source: unknown 12755 1727204134.04565: variable 'ansible_connection' from source: unknown 12755 1727204134.04574: variable 'ansible_module_compression' from source: unknown 12755 1727204134.04624: variable 'ansible_shell_type' from source: unknown 12755 1727204134.04628: variable 'ansible_shell_executable' from source: unknown 12755 1727204134.04630: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.04633: variable 'ansible_pipelining' from source: unknown 12755 1727204134.04635: variable 'ansible_timeout' from source: unknown 12755 1727204134.04638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.04811: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204134.04832: variable 'omit' from source: magic vars 12755 1727204134.04847: starting attempt loop 12755 1727204134.04856: running the handler 12755 1727204134.05061: variable '__network_connections_result' from source: set_fact 12755 1727204134.05101: handler run complete 12755 1727204134.05131: attempt loop complete, returning result 12755 1727204134.05140: _execute() done 12755 1727204134.05149: dumping result to json 12755 1727204134.05158: 
done dumping result, returning 12755 1727204134.05179: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-72e9-1a19-00000000012d] 12755 1727204134.05277: sending task result for task 12b410aa-8751-72e9-1a19-00000000012d 12755 1727204134.05356: done sending task result for task 12b410aa-8751-72e9-1a19-00000000012d 12755 1727204134.05359: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 12755 1727204134.05444: no more pending results, returning what we have 12755 1727204134.05448: results queue empty 12755 1727204134.05449: checking for any_errors_fatal 12755 1727204134.05456: done checking for any_errors_fatal 12755 1727204134.05457: checking for max_fail_percentage 12755 1727204134.05459: done checking for max_fail_percentage 12755 1727204134.05460: checking to see if all hosts have failed and the running result is not ok 12755 1727204134.05462: done checking to see if all hosts have failed 12755 1727204134.05463: getting the remaining hosts for this loop 12755 1727204134.05464: done getting the remaining hosts for this loop 12755 1727204134.05470: getting the next task for host managed-node1 12755 1727204134.05478: done getting next task for host managed-node1 12755 1727204134.05483: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204134.05487: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204134.05504: getting variables 12755 1727204134.05507: in VariableManager get_vars() 12755 1727204134.05575: Calling all_inventory to load vars for managed-node1 12755 1727204134.05579: Calling groups_inventory to load vars for managed-node1 12755 1727204134.05583: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204134.05798: Calling all_plugins_play to load vars for managed-node1 12755 1727204134.05803: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204134.05808: Calling groups_plugins_play to load vars for managed-node1 12755 1727204134.08018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204134.11048: done with get_vars() 12755 1727204134.11083: done getting variables 12755 1727204134.11159: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.089) 0:00:59.347 ***** 12755 1727204134.11203: entering _queue_task() for managed-node1/debug 12755 1727204134.11587: worker is 1 (out of 1 available) 12755 1727204134.11605: exiting _queue_task() for managed-node1/debug 12755 1727204134.11619: done queuing things up, now waiting for results queue to drain 12755 1727204134.11620: waiting for pending results... 
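The "Show stderr messages" task above prints `__network_connections_result.stderr_lines` as `[""]` even though the NM tracebacks appeared earlier: the module itself returned `"stderr": "\n"`, and the derived `*_lines` fields come from splitting that string on newlines. A quick check of that behavior:

```python
# The module result above carried "stderr": "\n"; splitting a lone
# newline yields a single empty string, which is why the debug task
# shows [""] rather than [].
stderr = "\n"
stderr_lines = stderr.splitlines()
print(stderr_lines)  # [''] -- one empty line, matching the log output
```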
12755 1727204134.12013: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204134.12124: in run() - task 12b410aa-8751-72e9-1a19-00000000012e 12755 1727204134.12149: variable 'ansible_search_path' from source: unknown 12755 1727204134.12159: variable 'ansible_search_path' from source: unknown 12755 1727204134.12208: calling self._execute() 12755 1727204134.12347: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.12362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.12380: variable 'omit' from source: magic vars 12755 1727204134.12851: variable 'ansible_distribution_major_version' from source: facts 12755 1727204134.12981: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204134.12985: variable 'omit' from source: magic vars 12755 1727204134.12988: variable 'omit' from source: magic vars 12755 1727204134.13025: variable 'omit' from source: magic vars 12755 1727204134.13075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204134.13126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204134.13155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204134.13179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204134.13204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204134.13247: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204134.13257: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.13267: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12755 1727204134.13393: Set connection var ansible_connection to ssh 12755 1727204134.13406: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204134.13418: Set connection var ansible_shell_type to sh 12755 1727204134.13436: Set connection var ansible_timeout to 10 12755 1727204134.13448: Set connection var ansible_shell_executable to /bin/sh 12755 1727204134.13459: Set connection var ansible_pipelining to False 12755 1727204134.13490: variable 'ansible_shell_executable' from source: unknown 12755 1727204134.13500: variable 'ansible_connection' from source: unknown 12755 1727204134.13523: variable 'ansible_module_compression' from source: unknown 12755 1727204134.13526: variable 'ansible_shell_type' from source: unknown 12755 1727204134.13529: variable 'ansible_shell_executable' from source: unknown 12755 1727204134.13531: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.13632: variable 'ansible_pipelining' from source: unknown 12755 1727204134.13636: variable 'ansible_timeout' from source: unknown 12755 1727204134.13639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.13732: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204134.13758: variable 'omit' from source: magic vars 12755 1727204134.13770: starting attempt loop 12755 1727204134.13777: running the handler 12755 1727204134.13837: variable '__network_connections_result' from source: set_fact 12755 1727204134.13938: variable '__network_connections_result' from source: set_fact 12755 1727204134.14109: handler run complete 12755 1727204134.14150: attempt loop complete, returning result 12755 1727204134.14158: 
_execute() done 12755 1727204134.14165: dumping result to json 12755 1727204134.14177: done dumping result, returning 12755 1727204134.14192: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-72e9-1a19-00000000012e] 12755 1727204134.14202: sending task result for task 12b410aa-8751-72e9-1a19-00000000012e ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12755 1727204134.14558: no more pending results, returning what we have 12755 1727204134.14563: results queue empty 12755 1727204134.14564: checking for any_errors_fatal 12755 1727204134.14571: done checking for any_errors_fatal 12755 1727204134.14572: checking for max_fail_percentage 12755 1727204134.14574: done checking for max_fail_percentage 12755 1727204134.14575: checking to see if all hosts have failed and the running result is not ok 12755 1727204134.14576: done checking to see if all hosts have failed 12755 1727204134.14578: getting the remaining hosts for this loop 12755 1727204134.14579: done getting the remaining hosts for this loop 12755 1727204134.14585: getting the next task for host managed-node1 12755 1727204134.14796: done getting next task for host managed-node1 12755 1727204134.14802: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12755 1727204134.14806: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204134.14820: getting variables 12755 1727204134.14822: in VariableManager get_vars() 12755 1727204134.14880: Calling all_inventory to load vars for managed-node1 12755 1727204134.14884: Calling groups_inventory to load vars for managed-node1 12755 1727204134.14887: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204134.14901: Calling all_plugins_play to load vars for managed-node1 12755 1727204134.14905: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204134.14909: Calling groups_plugins_play to load vars for managed-node1 12755 1727204134.15507: done sending task result for task 12b410aa-8751-72e9-1a19-00000000012e 12755 1727204134.15511: WORKER PROCESS EXITING 12755 1727204134.17113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204134.20015: done with get_vars() 12755 1727204134.20054: done getting variables 12755 1727204134.20122: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.089) 0:00:59.437 ***** 12755 
1727204134.20167: entering _queue_task() for managed-node1/debug 12755 1727204134.20530: worker is 1 (out of 1 available) 12755 1727204134.20543: exiting _queue_task() for managed-node1/debug 12755 1727204134.20558: done queuing things up, now waiting for results queue to drain 12755 1727204134.20560: waiting for pending results... 12755 1727204134.20880: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12755 1727204134.21064: in run() - task 12b410aa-8751-72e9-1a19-00000000012f 12755 1727204134.21093: variable 'ansible_search_path' from source: unknown 12755 1727204134.21103: variable 'ansible_search_path' from source: unknown 12755 1727204134.21153: calling self._execute() 12755 1727204134.21283: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.21301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.21322: variable 'omit' from source: magic vars 12755 1727204134.21813: variable 'ansible_distribution_major_version' from source: facts 12755 1727204134.21835: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204134.22005: variable 'network_state' from source: role '' defaults 12755 1727204134.22023: Evaluated conditional (network_state != {}): False 12755 1727204134.22032: when evaluation is False, skipping this task 12755 1727204134.22040: _execute() done 12755 1727204134.22049: dumping result to json 12755 1727204134.22059: done dumping result, returning 12755 1727204134.22074: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-72e9-1a19-00000000012f] 12755 1727204134.22086: sending task result for task 12b410aa-8751-72e9-1a19-00000000012f skipping: [managed-node1] => { "false_condition": "network_state != {}" } 12755 1727204134.22262: no more pending results, returning what we have 
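The `skipping: [managed-node1]` result above follows from the role default: `network_state` was left as an empty dict, so the task's conditional `network_state != {}` evaluates false. A hypothetical minimal re-creation of that check in plain Python (Ansible actually evaluates the expression through Jinja2, not like this):

```python
# Role default: no declarative network_state was supplied.
network_state = {}

# The task's `when: network_state != {}` condition, evaluated directly.
run_task = network_state != {}
print(run_task)  # False -> Ansible reports: skipping: [managed-node1]
```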
12755 1727204134.22267: results queue empty 12755 1727204134.22269: checking for any_errors_fatal 12755 1727204134.22280: done checking for any_errors_fatal 12755 1727204134.22281: checking for max_fail_percentage 12755 1727204134.22283: done checking for max_fail_percentage 12755 1727204134.22285: checking to see if all hosts have failed and the running result is not ok 12755 1727204134.22286: done checking to see if all hosts have failed 12755 1727204134.22287: getting the remaining hosts for this loop 12755 1727204134.22291: done getting the remaining hosts for this loop 12755 1727204134.22296: getting the next task for host managed-node1 12755 1727204134.22306: done getting next task for host managed-node1 12755 1727204134.22310: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12755 1727204134.22315: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204134.22343: getting variables 12755 1727204134.22346: in VariableManager get_vars() 12755 1727204134.22716: Calling all_inventory to load vars for managed-node1 12755 1727204134.22720: Calling groups_inventory to load vars for managed-node1 12755 1727204134.22723: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204134.22735: Calling all_plugins_play to load vars for managed-node1 12755 1727204134.22739: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204134.22744: Calling groups_plugins_play to load vars for managed-node1 12755 1727204134.23408: done sending task result for task 12b410aa-8751-72e9-1a19-00000000012f 12755 1727204134.23412: WORKER PROCESS EXITING 12755 1727204134.25428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204134.30178: done with get_vars() 12755 1727204134.30232: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.102) 0:00:59.539 ***** 12755 1727204134.30387: entering _queue_task() for managed-node1/ping 12755 1727204134.30981: worker is 1 (out of 1 available) 12755 1727204134.31000: exiting _queue_task() for managed-node1/ping 12755 1727204134.31014: done queuing things up, now waiting for results queue to drain 12755 1727204134.31016: waiting for pending results... 
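The remote working directory created in the next step, `ansible-tmp-1727204134.3791354-16139-219105382153659`, follows a timestamp/PID/random-integer naming pattern. A sketch of how a name with that shape could be generated (an assumption: this mirrors, rather than reuses, Ansible's internal shell-plugin naming scheme):

```python
import os
import random
import time

# Build a temp-dir name shaped like the one in the log:
# ansible-tmp-<epoch seconds with fraction>-<pid>-<random integer>.
name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
print(name)
```

The uniqueness comes from combining all three components, so concurrent workers on the same controller do not collide even within one clock tick.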
12755 1727204134.31677: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12755 1727204134.31867: in run() - task 12b410aa-8751-72e9-1a19-000000000130 12755 1727204134.31897: variable 'ansible_search_path' from source: unknown 12755 1727204134.31908: variable 'ansible_search_path' from source: unknown 12755 1727204134.31961: calling self._execute() 12755 1727204134.32142: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.32158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.32177: variable 'omit' from source: magic vars 12755 1727204134.32666: variable 'ansible_distribution_major_version' from source: facts 12755 1727204134.32688: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204134.32706: variable 'omit' from source: magic vars 12755 1727204134.32797: variable 'omit' from source: magic vars 12755 1727204134.32854: variable 'omit' from source: magic vars 12755 1727204134.32914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204134.32968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204134.33039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204134.33072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204134.33097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204134.33139: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204134.33151: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.33161: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12755 1727204134.33464: Set connection var ansible_connection to ssh 12755 1727204134.33482: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204134.33495: Set connection var ansible_shell_type to sh 12755 1727204134.33517: Set connection var ansible_timeout to 10 12755 1727204134.33645: Set connection var ansible_shell_executable to /bin/sh 12755 1727204134.33658: Set connection var ansible_pipelining to False 12755 1727204134.33698: variable 'ansible_shell_executable' from source: unknown 12755 1727204134.33708: variable 'ansible_connection' from source: unknown 12755 1727204134.33718: variable 'ansible_module_compression' from source: unknown 12755 1727204134.33727: variable 'ansible_shell_type' from source: unknown 12755 1727204134.33737: variable 'ansible_shell_executable' from source: unknown 12755 1727204134.33749: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.33759: variable 'ansible_pipelining' from source: unknown 12755 1727204134.33768: variable 'ansible_timeout' from source: unknown 12755 1727204134.33778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.34042: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204134.34070: variable 'omit' from source: magic vars 12755 1727204134.34095: starting attempt loop 12755 1727204134.34099: running the handler 12755 1727204134.34271: _low_level_execute_command(): starting 12755 1727204134.34275: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204134.35259: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204134.35486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204134.35597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204134.35713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204134.37604: stdout chunk (state=3): >>>/root <<< 12755 1727204134.37834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204134.37849: stdout chunk (state=3): >>><<< 12755 1727204134.37871: stderr chunk (state=3): >>><<< 12755 1727204134.37905: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204134.37930: _low_level_execute_command(): starting 12755 1727204134.37946: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659 `" && echo ansible-tmp-1727204134.3791354-16139-219105382153659="` echo /root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659 `" ) && sleep 0' 12755 1727204134.38629: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204134.38655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204134.38672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204134.38696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204134.38717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204134.38767: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204134.38870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204134.38914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204134.38981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204134.41177: stdout chunk (state=3): >>>ansible-tmp-1727204134.3791354-16139-219105382153659=/root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659 <<< 12755 1727204134.41292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204134.41405: stderr chunk (state=3): >>><<< 12755 1727204134.41408: stdout chunk (state=3): >>><<< 12755 1727204134.41414: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204134.3791354-16139-219105382153659=/root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204134.41461: variable 'ansible_module_compression' from source: unknown 12755 1727204134.41696: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12755 1727204134.41700: variable 'ansible_facts' from source: unknown 12755 1727204134.41703: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/AnsiballZ_ping.py 12755 1727204134.42395: Sending initial data 12755 1727204134.42399: Sent initial data (153 bytes) 12755 1727204134.43218: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204134.43228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204134.43235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204134.43254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204134.43261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204134.43406: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204134.43607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204134.43683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204134.45439: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12755 1727204134.45447: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 12755 1727204134.45456: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 12755 1727204134.45482: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204134.45534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204134.45588: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmph_3rxbzk /root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/AnsiballZ_ping.py <<< 12755 1727204134.45593: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/AnsiballZ_ping.py" <<< 12755 1727204134.45647: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmph_3rxbzk" to remote "/root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/AnsiballZ_ping.py" <<< 12755 1727204134.46701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204134.46843: stderr chunk (state=3): >>><<< 12755 1727204134.46847: stdout chunk (state=3): >>><<< 12755 1727204134.46849: done transferring module to remote 12755 1727204134.46851: _low_level_execute_command(): starting 12755 1727204134.46854: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/ /root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/AnsiballZ_ping.py && sleep 0' 12755 1727204134.47700: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204134.47903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204134.48307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204134.48378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204134.50605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204134.50609: stdout chunk (state=3): >>><<< 12755 1727204134.50621: stderr chunk (state=3): >>><<< 12755 1727204134.50642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204134.50645: _low_level_execute_command(): starting 12755 1727204134.50651: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/AnsiballZ_ping.py && sleep 0' 12755 1727204134.52009: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204134.52043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204134.52227: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204134.52314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204134.70698: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12755 1727204134.72499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204134.72503: stderr chunk (state=3): >>><<< 12755 1727204134.72506: stdout chunk (state=3): >>><<< 12755 1727204134.72508: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204134.72596: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204134.72601: _low_level_execute_command(): starting 12755 1727204134.72603: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204134.3791354-16139-219105382153659/ > /dev/null 2>&1 && sleep 0' 12755 1727204134.74655: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204134.74660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204134.74997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204134.75317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204134.75385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204134.77456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204134.77896: stderr chunk (state=3): >>><<< 12755 1727204134.77900: stdout chunk (state=3): >>><<< 12755 1727204134.77924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204134.77934: handler run complete 12755 1727204134.77961: attempt loop complete, returning result 12755 1727204134.77965: _execute() done 12755 1727204134.77968: dumping result to json 12755 
1727204134.77973: done dumping result, returning 12755 1727204134.77993: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-72e9-1a19-000000000130] 12755 1727204134.77996: sending task result for task 12b410aa-8751-72e9-1a19-000000000130 12755 1727204134.78133: done sending task result for task 12b410aa-8751-72e9-1a19-000000000130 12755 1727204134.78137: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 12755 1727204134.78223: no more pending results, returning what we have 12755 1727204134.78227: results queue empty 12755 1727204134.78229: checking for any_errors_fatal 12755 1727204134.78237: done checking for any_errors_fatal 12755 1727204134.78238: checking for max_fail_percentage 12755 1727204134.78240: done checking for max_fail_percentage 12755 1727204134.78241: checking to see if all hosts have failed and the running result is not ok 12755 1727204134.78243: done checking to see if all hosts have failed 12755 1727204134.78244: getting the remaining hosts for this loop 12755 1727204134.78245: done getting the remaining hosts for this loop 12755 1727204134.78251: getting the next task for host managed-node1 12755 1727204134.78265: done getting next task for host managed-node1 12755 1727204134.78268: ^ task is: TASK: meta (role_complete) 12755 1727204134.78272: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204134.78288: getting variables 12755 1727204134.78293: in VariableManager get_vars() 12755 1727204134.78367: Calling all_inventory to load vars for managed-node1 12755 1727204134.78372: Calling groups_inventory to load vars for managed-node1 12755 1727204134.78375: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204134.79195: Calling all_plugins_play to load vars for managed-node1 12755 1727204134.79203: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204134.79209: Calling groups_plugins_play to load vars for managed-node1 12755 1727204134.81820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204134.86148: done with get_vars() 12755 1727204134.86199: done getting variables 12755 1727204134.86416: done queuing things up, now waiting for results queue to drain 12755 1727204134.86419: results queue empty 12755 1727204134.86420: checking for any_errors_fatal 12755 1727204134.86424: done checking for any_errors_fatal 12755 1727204134.86426: checking for max_fail_percentage 12755 1727204134.86428: done checking for max_fail_percentage 12755 1727204134.86429: checking to see if all hosts have failed and the running result is not ok 12755 1727204134.86430: done checking to see if all hosts have failed 12755 1727204134.86431: getting the remaining hosts for this loop 12755 1727204134.86432: done getting the remaining hosts for this loop 12755 1727204134.86436: getting the next task for host managed-node1 12755 1727204134.86441: done getting next task for host managed-node1 12755 1727204134.86444: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 12755 1727204134.86446: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12755 1727204134.86449: getting variables 12755 1727204134.86450: in VariableManager get_vars() 12755 1727204134.86479: Calling all_inventory to load vars for managed-node1 12755 1727204134.86483: Calling groups_inventory to load vars for managed-node1 12755 1727204134.86485: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204134.86532: Calling all_plugins_play to load vars for managed-node1 12755 1727204134.86537: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204134.86541: Calling groups_plugins_play to load vars for managed-node1 12755 1727204134.90038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204134.96126: done with get_vars() 12755 1727204134.96172: done getting variables 12755 1727204134.96354: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204134.96515: variable 'controller_profile' from source: play vars TASK [From the active connection, get the controller profile "bond0"] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200 Tuesday 24 September 2024 14:55:34 -0400 (0:00:00.661) 0:01:00.201 ***** 12755 1727204134.96549: entering _queue_task() for managed-node1/command 12755 1727204134.96922: worker is 1 (out of 1 available) 12755 1727204134.96937: exiting _queue_task() for managed-node1/command 12755 1727204134.96950: done queuing things up, now waiting for results queue to drain 12755 1727204134.96952: waiting for pending results... 
12755 1727204134.97322: running TaskExecutor() for managed-node1/TASK: From the active connection, get the controller profile "bond0" 12755 1727204134.97397: in run() - task 12b410aa-8751-72e9-1a19-000000000160 12755 1727204134.97404: variable 'ansible_search_path' from source: unknown 12755 1727204134.97455: calling self._execute() 12755 1727204134.97635: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.97639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.97642: variable 'omit' from source: magic vars 12755 1727204134.98088: variable 'ansible_distribution_major_version' from source: facts 12755 1727204134.98114: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204134.98268: variable 'network_provider' from source: set_fact 12755 1727204134.98287: Evaluated conditional (network_provider == "nm"): True 12755 1727204134.98394: variable 'omit' from source: magic vars 12755 1727204134.98398: variable 'omit' from source: magic vars 12755 1727204134.98453: variable 'controller_profile' from source: play vars 12755 1727204134.98479: variable 'omit' from source: magic vars 12755 1727204134.98538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204134.98584: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204134.98620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204134.98650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204134.98669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204134.98715: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 
1727204134.98725: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.98738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.98953: Set connection var ansible_connection to ssh 12755 1727204134.98957: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204134.98959: Set connection var ansible_shell_type to sh 12755 1727204134.98962: Set connection var ansible_timeout to 10 12755 1727204134.98964: Set connection var ansible_shell_executable to /bin/sh 12755 1727204134.98966: Set connection var ansible_pipelining to False 12755 1727204134.98969: variable 'ansible_shell_executable' from source: unknown 12755 1727204134.98971: variable 'ansible_connection' from source: unknown 12755 1727204134.98974: variable 'ansible_module_compression' from source: unknown 12755 1727204134.98982: variable 'ansible_shell_type' from source: unknown 12755 1727204134.98990: variable 'ansible_shell_executable' from source: unknown 12755 1727204134.99000: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204134.99012: variable 'ansible_pipelining' from source: unknown 12755 1727204134.99022: variable 'ansible_timeout' from source: unknown 12755 1727204134.99032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204134.99215: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204134.99237: variable 'omit' from source: magic vars 12755 1727204134.99247: starting attempt loop 12755 1727204134.99255: running the handler 12755 1727204134.99295: _low_level_execute_command(): starting 12755 1727204134.99298: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 
0' 12755 1727204135.00155: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204135.00173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204135.00235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204135.00502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204135.00763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204135.02651: stdout chunk (state=3): >>>/root <<< 12755 1727204135.02932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204135.02936: stdout chunk (state=3): >>><<< 12755 1727204135.02939: stderr chunk (state=3): >>><<< 12755 1727204135.03042: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204135.03301: _low_level_execute_command(): starting 12755 1727204135.03305: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762 `" && echo ansible-tmp-1727204135.030221-16213-256330627901762="` echo /root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762 `" ) && sleep 0' 12755 1727204135.04767: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204135.04874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204135.05011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204135.05094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204135.07224: stdout chunk (state=3): >>>ansible-tmp-1727204135.030221-16213-256330627901762=/root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762 <<< 12755 1727204135.07369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204135.07445: stderr chunk (state=3): >>><<< 12755 1727204135.07448: stdout chunk (state=3): >>><<< 12755 1727204135.07484: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204135.030221-16213-256330627901762=/root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204135.07523: variable 'ansible_module_compression' from source: unknown 12755 1727204135.07584: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204135.07620: variable 'ansible_facts' from source: unknown 12755 1727204135.08127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/AnsiballZ_command.py 12755 1727204135.08798: Sending initial data 12755 1727204135.08802: Sent initial data (155 bytes) 12755 1727204135.09696: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204135.09700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204135.09704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204135.09707: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204135.09713: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204135.09835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204135.09838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204135.09860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204135.10121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204135.11817: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204135.11877: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204135.11919: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpr6hk_x3l /root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/AnsiballZ_command.py <<< 12755 1727204135.11926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/AnsiballZ_command.py" <<< 12755 1727204135.11980: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpr6hk_x3l" to remote "/root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/AnsiballZ_command.py" <<< 12755 1727204135.14099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204135.14126: stderr chunk (state=3): >>><<< 12755 1727204135.14130: stdout chunk (state=3): >>><<< 12755 1727204135.14219: done transferring module to remote 12755 1727204135.14223: _low_level_execute_command(): starting 12755 1727204135.14226: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/ /root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/AnsiballZ_command.py && sleep 0' 12755 1727204135.15603: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204135.15821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204135.15841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204135.15923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204135.17974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204135.17978: stdout chunk (state=3): >>><<< 12755 1727204135.17986: stderr chunk (state=3): >>><<< 12755 1727204135.18008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204135.18019: _low_level_execute_command(): starting 12755 1727204135.18022: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/AnsiballZ_command.py && sleep 0' 12755 1727204135.19397: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204135.19403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204135.19406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204135.19411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204135.19413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204135.19416: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204135.19418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204135.19421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204135.19423: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204135.19425: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12755 1727204135.19427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204135.19429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204135.19431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204135.19433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 <<< 12755 1727204135.19435: stderr chunk (state=3): >>>debug2: match found <<< 12755 1727204135.19437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204135.19439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204135.19451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204135.19465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204135.19555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204135.39766: stdout chunk (state=3): >>> {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1727204124\nconnection.permissions: --\nconnection.zone: --\nconnection.master: --\nconnection.slave-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: 
yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (unknown)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/22\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.217/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:e2:27:4f:b7:c2:61\nDHCP4.OPTION[3]: dhcp_lease_time = 
240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1727204364\nDHCP4.OPTION[7]: host_name = managed-node1\nDHCP4.OPTION[8]: ip_address = 192.0.2.217\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::11b/128\nIP6.ADDRESS[2]: 2001:db8::a19:91df:58d7:1f0b/64\nIP6.ADDRESS[3]: fe80::654f:259:5735:c485/64\nIP6.GATEWAY: fe80::d81a:4aff:fecf:c797\nIP6.ROUTE[1]: dst = 2001:db8::11b/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::d81a:4aff:fecf:c797, mt = 300\nIP6.DNS[1]: 2001:db8::6405:89ff:fed6:1c79\nIP6.DNS[2]: fe80::d81a:4aff:fecf:c797\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:91:3e:1f:a3:b7:ab:55:f6:9c:6b:d5:1d:7a:42:35:a7\nDHCP6.OPTION[2]: <<< 12755 1727204135.39794: stdout chunk (state=3): >>> dhcp6_name_servers = 2001:db8::6405:89ff:fed6:1c79\nDHCP6.OPTION[3]: fqdn_fqdn = managed-node1\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::11b", "stderr": 
"", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-24 14:55:35.374937", "end": "2024-09-24 14:55:35.396303", "delta": "0:00:00.021366", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204135.41840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204135.41848: stdout chunk (state=3): >>><<< 12755 1727204135.41857: stderr chunk (state=3): >>><<< 12755 1727204135.41976: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1727204124\nconnection.permissions: --\nconnection.zone: --\nconnection.master: --\nconnection.slave-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: 
no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (unknown)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/22\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.217/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 
192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:e2:27:4f:b7:c2:61\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1727204364\nDHCP4.OPTION[7]: host_name = managed-node1\nDHCP4.OPTION[8]: ip_address = 192.0.2.217\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::11b/128\nIP6.ADDRESS[2]: 2001:db8::a19:91df:58d7:1f0b/64\nIP6.ADDRESS[3]: fe80::654f:259:5735:c485/64\nIP6.GATEWAY: fe80::d81a:4aff:fecf:c797\nIP6.ROUTE[1]: dst = 2001:db8::11b/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::d81a:4aff:fecf:c797, mt = 300\nIP6.DNS[1]: 2001:db8::6405:89ff:fed6:1c79\nIP6.DNS[2]: fe80::d81a:4aff:fecf:c797\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:91:3e:1f:a3:b7:ab:55:f6:9c:6b:d5:1d:7a:42:35:a7\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::6405:89ff:fed6:1c79\nDHCP6.OPTION[3]: fqdn_fqdn = managed-node1\nDHCP6.OPTION[4]: iaid = 
8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::11b", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-24 14:55:35.374937", "end": "2024-09-24 14:55:35.396303", "delta": "0:00:00.021366", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
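The registered result above carries the entire `nmcli c show --active bond0` output as one escaped string of `key: value` lines, with `--` standing in for unset fields. The test play itself only checks that this stdout is non-empty, but if one wanted to inspect individual settings (a hypothetical helper, not part of the test play), the two-column nmcli output can be parsed into a dict like this:

```python
def parse_nmcli_show(stdout: str) -> dict:
    """Parse `nmcli c show` output: each line is 'key: value'
    with the value padded by whitespace; '--' means unset."""
    settings = {}
    for line in stdout.splitlines():
        if ":" not in line:
            continue
        # Split on the first colon only; values (e.g. IPv6 addresses)
        # may themselves contain colons.
        key, _, value = line.partition(":")
        value = value.strip()
        settings[key.strip()] = None if value == "--" else value
    return settings

# A few lines lifted from the stdout in the log entry above:
sample = (
    "connection.id:                          bond0\n"
    "connection.interface-name:              nm-bond\n"
    "connection.master:                      --\n"
    "bond.options:                           mode=active-backup,miimon=110\n"
)
parsed = parse_nmcli_show(sample)
```

This keeps the lookup keyed exactly as nmcli prints it (`connection.id`, `bond.options`, ...), so values such as `mode=active-backup,miimon=110` can be asserted on directly.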
12755 1727204135.42086: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204135.42230: _low_level_execute_command(): starting 12755 1727204135.42236: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204135.030221-16213-256330627901762/ > /dev/null 2>&1 && sleep 0' 12755 1727204135.43566: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204135.43571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204135.43588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204135.43596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204135.43805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204135.43871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204135.43949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204135.46231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204135.46311: stderr chunk (state=3): >>><<< 12755 1727204135.46315: stdout chunk (state=3): >>><<< 12755 1727204135.46335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204135.46346: 
handler run complete 12755 1727204135.46387: Evaluated conditional (False): False 12755 1727204135.46408: attempt loop complete, returning result 12755 1727204135.46415: _execute() done 12755 1727204135.46417: dumping result to json 12755 1727204135.46430: done dumping result, returning 12755 1727204135.46441: done running TaskExecutor() for managed-node1/TASK: From the active connection, get the controller profile "bond0" [12b410aa-8751-72e9-1a19-000000000160] 12755 1727204135.46447: sending task result for task 12b410aa-8751-72e9-1a19-000000000160 ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0" ], "delta": "0:00:00.021366", "end": "2024-09-24 14:55:35.396303", "rc": 0, "start": "2024-09-24 14:55:35.374937" } STDOUT: connection.id: bond0 connection.uuid: 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44 connection.stable-id: -- connection.type: bond connection.interface-name: nm-bond connection.autoconnect: yes connection.autoconnect-priority: 0 connection.autoconnect-retries: -1 (default) connection.multi-connect: 0 (default) connection.auth-retries: -1 connection.timestamp: 1727204124 connection.permissions: -- connection.zone: -- connection.master: -- connection.slave-type: -- connection.autoconnect-slaves: -1 (default) connection.secondaries: -- connection.gateway-ping-timeout: 0 connection.metered: unknown connection.lldp: default connection.mdns: -1 (default) connection.llmnr: -1 (default) connection.dns-over-tls: -1 (default) connection.mptcp-flags: 0x0 (default) connection.wait-device-timeout: -1 connection.wait-activation-delay: -1 ipv4.method: auto ipv4.dns: -- ipv4.dns-search: -- ipv4.dns-options: -- ipv4.dns-priority: 0 ipv4.addresses: -- ipv4.gateway: -- ipv4.routes: -- ipv4.route-metric: 65535 ipv4.route-table: 0 (unspec) ipv4.routing-rules: -- ipv4.replace-local-rule: -1 (default) ipv4.ignore-auto-routes: no ipv4.ignore-auto-dns: no ipv4.dhcp-client-id: -- ipv4.dhcp-iaid: -- ipv4.dhcp-timeout: 0 (default) 
ipv4.dhcp-send-hostname: yes ipv4.dhcp-hostname: -- ipv4.dhcp-fqdn: -- ipv4.dhcp-hostname-flags: 0x0 (none) ipv4.never-default: no ipv4.may-fail: yes ipv4.required-timeout: -1 (default) ipv4.dad-timeout: -1 (default) ipv4.dhcp-vendor-class-identifier: -- ipv4.link-local: 0 (default) ipv4.dhcp-reject-servers: -- ipv4.auto-route-ext-gw: -1 (default) ipv6.method: auto ipv6.dns: -- ipv6.dns-search: -- ipv6.dns-options: -- ipv6.dns-priority: 0 ipv6.addresses: -- ipv6.gateway: -- ipv6.routes: -- ipv6.route-metric: -1 ipv6.route-table: 0 (unspec) ipv6.routing-rules: -- ipv6.replace-local-rule: -1 (default) ipv6.ignore-auto-routes: no ipv6.ignore-auto-dns: no ipv6.never-default: no ipv6.may-fail: yes ipv6.required-timeout: -1 (default) ipv6.ip6-privacy: -1 (unknown) ipv6.addr-gen-mode: default ipv6.ra-timeout: 0 (default) ipv6.mtu: auto ipv6.dhcp-pd-hint: -- ipv6.dhcp-duid: -- ipv6.dhcp-iaid: -- ipv6.dhcp-timeout: 0 (default) ipv6.dhcp-send-hostname: yes ipv6.dhcp-hostname: -- ipv6.dhcp-hostname-flags: 0x0 (none) ipv6.auto-route-ext-gw: -1 (default) ipv6.token: -- bond.options: mode=active-backup,miimon=110 proxy.method: none proxy.browser-only: no proxy.pac-url: -- proxy.pac-script: -- GENERAL.NAME: bond0 GENERAL.UUID: 03d70ce0-ddeb-47d0-bf95-c6e32f95cb44 GENERAL.DEVICES: nm-bond GENERAL.IP-IFACE: nm-bond GENERAL.STATE: activated GENERAL.DEFAULT: no GENERAL.DEFAULT6: yes GENERAL.SPEC-OBJECT: -- GENERAL.VPN: no GENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/22 GENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/18 GENERAL.ZONE: -- GENERAL.MASTER-PATH: -- IP4.ADDRESS[1]: 192.0.2.217/24 IP4.GATEWAY: 192.0.2.1 IP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535 IP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535 IP4.DNS[1]: 192.0.2.1 DHCP4.OPTION[1]: broadcast_address = 192.0.2.255 DHCP4.OPTION[2]: dhcp_client_identifier = 01:e2:27:4f:b7:c2:61 DHCP4.OPTION[3]: dhcp_lease_time = 240 DHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1 
DHCP4.OPTION[5]: domain_name_servers = 192.0.2.1 DHCP4.OPTION[6]: expiry = 1727204364 DHCP4.OPTION[7]: host_name = managed-node1 DHCP4.OPTION[8]: ip_address = 192.0.2.217 DHCP4.OPTION[9]: next_server = 192.0.2.1 DHCP4.OPTION[10]: requested_broadcast_address = 1 DHCP4.OPTION[11]: requested_domain_name = 1 DHCP4.OPTION[12]: requested_domain_name_servers = 1 DHCP4.OPTION[13]: requested_domain_search = 1 DHCP4.OPTION[14]: requested_host_name = 1 DHCP4.OPTION[15]: requested_interface_mtu = 1 DHCP4.OPTION[16]: requested_ms_classless_static_routes = 1 DHCP4.OPTION[17]: requested_nis_domain = 1 DHCP4.OPTION[18]: requested_nis_servers = 1 DHCP4.OPTION[19]: requested_ntp_servers = 1 DHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1 DHCP4.OPTION[21]: requested_root_path = 1 DHCP4.OPTION[22]: requested_routers = 1 DHCP4.OPTION[23]: requested_static_routes = 1 DHCP4.OPTION[24]: requested_subnet_mask = 1 DHCP4.OPTION[25]: requested_time_offset = 1 DHCP4.OPTION[26]: requested_wpad = 1 DHCP4.OPTION[27]: routers = 192.0.2.1 DHCP4.OPTION[28]: subnet_mask = 255.255.255.0 IP6.ADDRESS[1]: 2001:db8::11b/128 IP6.ADDRESS[2]: 2001:db8::a19:91df:58d7:1f0b/64 IP6.ADDRESS[3]: fe80::654f:259:5735:c485/64 IP6.GATEWAY: fe80::d81a:4aff:fecf:c797 IP6.ROUTE[1]: dst = 2001:db8::11b/128, nh = ::, mt = 300 IP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300 IP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024 IP6.ROUTE[4]: dst = ::/0, nh = fe80::d81a:4aff:fecf:c797, mt = 300 IP6.DNS[1]: 2001:db8::6405:89ff:fed6:1c79 IP6.DNS[2]: fe80::d81a:4aff:fecf:c797 DHCP6.OPTION[1]: dhcp6_client_id = 00:04:91:3e:1f:a3:b7:ab:55:f6:9c:6b:d5:1d:7a:42:35:a7 DHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::6405:89ff:fed6:1c79 DHCP6.OPTION[3]: fqdn_fqdn = managed-node1 DHCP6.OPTION[4]: iaid = 8c:3b:13:c0 DHCP6.OPTION[5]: ip6_address = 2001:db8::11b 12755 1727204135.47179: no more pending results, returning what we have 12755 1727204135.47183: results queue empty 12755 1727204135.47185: checking for 
any_errors_fatal 12755 1727204135.47187: done checking for any_errors_fatal 12755 1727204135.47188: checking for max_fail_percentage 12755 1727204135.47307: done checking for max_fail_percentage 12755 1727204135.47308: checking to see if all hosts have failed and the running result is not ok 12755 1727204135.47311: done checking to see if all hosts have failed 12755 1727204135.47312: getting the remaining hosts for this loop 12755 1727204135.47314: done getting the remaining hosts for this loop 12755 1727204135.47319: getting the next task for host managed-node1 12755 1727204135.47325: done getting next task for host managed-node1 12755 1727204135.47328: ^ task is: TASK: Assert that the controller profile is activated 12755 1727204135.47330: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204135.47333: getting variables 12755 1727204135.47335: in VariableManager get_vars() 12755 1727204135.47388: Calling all_inventory to load vars for managed-node1 12755 1727204135.47393: Calling groups_inventory to load vars for managed-node1 12755 1727204135.47396: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204135.47418: done sending task result for task 12b410aa-8751-72e9-1a19-000000000160 12755 1727204135.47422: WORKER PROCESS EXITING 12755 1727204135.47433: Calling all_plugins_play to load vars for managed-node1 12755 1727204135.47436: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204135.47440: Calling groups_plugins_play to load vars for managed-node1 12755 1727204135.52834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204135.59919: done with get_vars() 12755 1727204135.59969: done getting variables 12755 1727204135.60042: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.635) 0:01:00.836 ***** 12755 1727204135.60077: entering _queue_task() for managed-node1/assert 12755 1727204135.60866: worker is 1 (out of 1 available) 12755 1727204135.60882: exiting _queue_task() for managed-node1/assert 12755 1727204135.61098: done queuing things up, now waiting for results queue to drain 12755 1727204135.61100: waiting for pending results... 
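The assert task queued here (tests_bond_removal.yml:207, per the log) will evaluate the conditional `active_controller_profile.stdout | length != 0` against the result registered by the previous command task. Inside Ansible this goes through Jinja2 templating; the following is only a plain-Python sketch mirroring the logic of that check:

```python
def controller_profile_activated(result: dict) -> bool:
    # Mirrors the Jinja2 condition
    # `active_controller_profile.stdout | length != 0`:
    # the controller profile counts as activated when
    # `nmcli c show --active bond0` printed anything at all.
    return len(result.get("stdout", "")) != 0

# The registered result in the log had a long stdout and rc=0,
# so the assertion passes; an empty stdout would fail it.
ok = controller_profile_activated({"rc": 0, "stdout": "connection.id: bond0"})
empty = controller_profile_activated({"rc": 0, "stdout": ""})
```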
12755 1727204135.61394: running TaskExecutor() for managed-node1/TASK: Assert that the controller profile is activated 12755 1727204135.61692: in run() - task 12b410aa-8751-72e9-1a19-000000000161 12755 1727204135.61708: variable 'ansible_search_path' from source: unknown 12755 1727204135.61747: calling self._execute() 12755 1727204135.61871: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204135.61878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204135.61890: variable 'omit' from source: magic vars 12755 1727204135.62927: variable 'ansible_distribution_major_version' from source: facts 12755 1727204135.62940: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204135.63087: variable 'network_provider' from source: set_fact 12755 1727204135.63297: Evaluated conditional (network_provider == "nm"): True 12755 1727204135.63308: variable 'omit' from source: magic vars 12755 1727204135.63336: variable 'omit' from source: magic vars 12755 1727204135.63460: variable 'controller_profile' from source: play vars 12755 1727204135.63481: variable 'omit' from source: magic vars 12755 1727204135.63732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204135.63773: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204135.63798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204135.63834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204135.63838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204135.63870: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204135.63874: variable 
'ansible_host' from source: host vars for 'managed-node1' 12755 1727204135.63876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204135.64698: Set connection var ansible_connection to ssh 12755 1727204135.64701: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204135.64704: Set connection var ansible_shell_type to sh 12755 1727204135.64706: Set connection var ansible_timeout to 10 12755 1727204135.64711: Set connection var ansible_shell_executable to /bin/sh 12755 1727204135.64714: Set connection var ansible_pipelining to False 12755 1727204135.64717: variable 'ansible_shell_executable' from source: unknown 12755 1727204135.64719: variable 'ansible_connection' from source: unknown 12755 1727204135.64722: variable 'ansible_module_compression' from source: unknown 12755 1727204135.64724: variable 'ansible_shell_type' from source: unknown 12755 1727204135.64726: variable 'ansible_shell_executable' from source: unknown 12755 1727204135.64729: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204135.64731: variable 'ansible_pipelining' from source: unknown 12755 1727204135.64734: variable 'ansible_timeout' from source: unknown 12755 1727204135.64736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204135.64739: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204135.64743: variable 'omit' from source: magic vars 12755 1727204135.64745: starting attempt loop 12755 1727204135.64748: running the handler 12755 1727204135.65185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204135.70843: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204135.70924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204135.70979: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204135.71228: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204135.71258: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204135.71346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204135.71382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204135.71621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204135.71672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204135.71692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204135.72026: variable 'active_controller_profile' from source: set_fact 12755 1727204135.72063: Evaluated conditional (active_controller_profile.stdout | length != 0): True 12755 1727204135.72194: handler run complete 12755 1727204135.72198: attempt 
loop complete, returning result 12755 1727204135.72200: _execute() done 12755 1727204135.72203: dumping result to json 12755 1727204135.72205: done dumping result, returning 12755 1727204135.72208: done running TaskExecutor() for managed-node1/TASK: Assert that the controller profile is activated [12b410aa-8751-72e9-1a19-000000000161] 12755 1727204135.72213: sending task result for task 12b410aa-8751-72e9-1a19-000000000161 12755 1727204135.72292: done sending task result for task 12b410aa-8751-72e9-1a19-000000000161 12755 1727204135.72296: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12755 1727204135.72363: no more pending results, returning what we have 12755 1727204135.72367: results queue empty 12755 1727204135.72368: checking for any_errors_fatal 12755 1727204135.72384: done checking for any_errors_fatal 12755 1727204135.72385: checking for max_fail_percentage 12755 1727204135.72386: done checking for max_fail_percentage 12755 1727204135.72387: checking to see if all hosts have failed and the running result is not ok 12755 1727204135.72388: done checking to see if all hosts have failed 12755 1727204135.72392: getting the remaining hosts for this loop 12755 1727204135.72394: done getting the remaining hosts for this loop 12755 1727204135.72399: getting the next task for host managed-node1 12755 1727204135.72406: done getting next task for host managed-node1 12755 1727204135.72411: ^ task is: TASK: Get the controller device details 12755 1727204135.72414: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204135.72419: getting variables 12755 1727204135.72427: in VariableManager get_vars() 12755 1727204135.72488: Calling all_inventory to load vars for managed-node1 12755 1727204135.72617: Calling groups_inventory to load vars for managed-node1 12755 1727204135.72621: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204135.72635: Calling all_plugins_play to load vars for managed-node1 12755 1727204135.72639: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204135.72643: Calling groups_plugins_play to load vars for managed-node1 12755 1727204135.77723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204135.85254: done with get_vars() 12755 1727204135.85698: done getting variables 12755 1727204135.85777: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Tuesday 24 September 2024 14:55:35 -0400 (0:00:00.257) 0:01:01.093 ***** 12755 1727204135.85819: entering _queue_task() for managed-node1/command 12755 1727204135.86894: worker is 1 (out of 1 available) 12755 1727204135.86911: exiting _queue_task() for managed-node1/command 12755 1727204135.86925: done queuing things up, now waiting for results queue to drain 12755 1727204135.86927: waiting for pending results... 
12755 1727204135.88092: running TaskExecutor() for managed-node1/TASK: Get the controller device details 12755 1727204135.88102: in run() - task 12b410aa-8751-72e9-1a19-000000000162 12755 1727204135.88106: variable 'ansible_search_path' from source: unknown 12755 1727204135.88112: calling self._execute() 12755 1727204135.88449: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204135.88454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204135.88457: variable 'omit' from source: magic vars 12755 1727204135.89411: variable 'ansible_distribution_major_version' from source: facts 12755 1727204135.89424: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204135.89700: variable 'network_provider' from source: set_fact 12755 1727204135.89895: Evaluated conditional (network_provider == "initscripts"): False 12755 1727204135.89899: when evaluation is False, skipping this task 12755 1727204135.89901: _execute() done 12755 1727204135.89904: dumping result to json 12755 1727204135.89906: done dumping result, returning 12755 1727204135.89908: done running TaskExecutor() for managed-node1/TASK: Get the controller device details [12b410aa-8751-72e9-1a19-000000000162] 12755 1727204135.89914: sending task result for task 12b410aa-8751-72e9-1a19-000000000162 12755 1727204135.89981: done sending task result for task 12b410aa-8751-72e9-1a19-000000000162 12755 1727204135.89985: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12755 1727204135.90050: no more pending results, returning what we have 12755 1727204135.90055: results queue empty 12755 1727204135.90056: checking for any_errors_fatal 12755 1727204135.90064: done checking for any_errors_fatal 12755 1727204135.90065: checking for max_fail_percentage 12755 1727204135.90067: done checking for 
max_fail_percentage 12755 1727204135.90068: checking to see if all hosts have failed and the running result is not ok 12755 1727204135.90069: done checking to see if all hosts have failed 12755 1727204135.90070: getting the remaining hosts for this loop 12755 1727204135.90072: done getting the remaining hosts for this loop 12755 1727204135.90077: getting the next task for host managed-node1 12755 1727204135.90085: done getting next task for host managed-node1 12755 1727204135.90091: ^ task is: TASK: Assert that the controller profile is activated 12755 1727204135.90094: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204135.90099: getting variables 12755 1727204135.90102: in VariableManager get_vars() 12755 1727204135.90171: Calling all_inventory to load vars for managed-node1 12755 1727204135.90175: Calling groups_inventory to load vars for managed-node1 12755 1727204135.90178: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204135.90336: Calling all_plugins_play to load vars for managed-node1 12755 1727204135.90342: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204135.90347: Calling groups_plugins_play to load vars for managed-node1 12755 1727204135.95091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204136.01725: done with get_vars() 12755 1727204136.01778: done getting variables 12755 1727204136.02019: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.162) 0:01:01.256 ***** 12755 1727204136.02105: entering _queue_task() for managed-node1/assert 12755 1727204136.03002: worker is 1 (out of 1 available) 12755 1727204136.03094: exiting _queue_task() for managed-node1/assert 12755 1727204136.03107: done queuing things up, now waiting for results queue to drain 12755 1727204136.03108: waiting for pending results... 12755 1727204136.04014: running TaskExecutor() for managed-node1/TASK: Assert that the controller profile is activated 12755 1727204136.04019: in run() - task 12b410aa-8751-72e9-1a19-000000000163 12755 1727204136.04023: variable 'ansible_search_path' from source: unknown 12755 1727204136.04027: calling self._execute() 12755 1727204136.04327: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204136.04332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204136.04335: variable 'omit' from source: magic vars 12755 1727204136.05224: variable 'ansible_distribution_major_version' from source: facts 12755 1727204136.05281: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204136.05551: variable 'network_provider' from source: set_fact 12755 1727204136.05560: Evaluated conditional (network_provider == "initscripts"): False 12755 1727204136.05563: when evaluation is False, skipping this task 12755 1727204136.05685: _execute() done 12755 1727204136.05690: dumping result to json 12755 1727204136.05694: done dumping result, returning 12755 1727204136.05703: done running TaskExecutor() for managed-node1/TASK: Assert that the controller profile is activated [12b410aa-8751-72e9-1a19-000000000163] 12755 1727204136.05713: sending task result for task 
12b410aa-8751-72e9-1a19-000000000163 12755 1727204136.05828: done sending task result for task 12b410aa-8751-72e9-1a19-000000000163 12755 1727204136.05832: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12755 1727204136.05906: no more pending results, returning what we have 12755 1727204136.05913: results queue empty 12755 1727204136.05914: checking for any_errors_fatal 12755 1727204136.05921: done checking for any_errors_fatal 12755 1727204136.05922: checking for max_fail_percentage 12755 1727204136.05924: done checking for max_fail_percentage 12755 1727204136.05924: checking to see if all hosts have failed and the running result is not ok 12755 1727204136.05925: done checking to see if all hosts have failed 12755 1727204136.05926: getting the remaining hosts for this loop 12755 1727204136.05928: done getting the remaining hosts for this loop 12755 1727204136.05933: getting the next task for host managed-node1 12755 1727204136.05949: done getting next task for host managed-node1 12755 1727204136.05954: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12755 1727204136.05959: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204136.05987: getting variables 12755 1727204136.05991: in VariableManager get_vars() 12755 1727204136.06065: Calling all_inventory to load vars for managed-node1 12755 1727204136.06069: Calling groups_inventory to load vars for managed-node1 12755 1727204136.06072: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204136.06088: Calling all_plugins_play to load vars for managed-node1 12755 1727204136.06297: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204136.06302: Calling groups_plugins_play to load vars for managed-node1 12755 1727204136.11157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204136.16437: done with get_vars() 12755 1727204136.16496: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.145) 0:01:01.402 ***** 12755 1727204136.16637: entering _queue_task() for managed-node1/include_tasks 12755 1727204136.17217: worker is 1 (out of 1 available) 12755 1727204136.17231: exiting _queue_task() for managed-node1/include_tasks 12755 1727204136.17244: done queuing things up, now waiting for results queue to drain 12755 1727204136.17245: waiting for pending results... 
12755 1727204136.17442: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12755 1727204136.17729: in run() - task 12b410aa-8751-72e9-1a19-00000000016c 12755 1727204136.17734: variable 'ansible_search_path' from source: unknown 12755 1727204136.17737: variable 'ansible_search_path' from source: unknown 12755 1727204136.17774: calling self._execute() 12755 1727204136.17915: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204136.17971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204136.17975: variable 'omit' from source: magic vars 12755 1727204136.18465: variable 'ansible_distribution_major_version' from source: facts 12755 1727204136.18486: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204136.18502: _execute() done 12755 1727204136.18519: dumping result to json 12755 1727204136.18571: done dumping result, returning 12755 1727204136.18575: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-72e9-1a19-00000000016c] 12755 1727204136.18578: sending task result for task 12b410aa-8751-72e9-1a19-00000000016c 12755 1727204136.18853: no more pending results, returning what we have 12755 1727204136.18859: in VariableManager get_vars() 12755 1727204136.18940: Calling all_inventory to load vars for managed-node1 12755 1727204136.18945: Calling groups_inventory to load vars for managed-node1 12755 1727204136.18948: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204136.18964: Calling all_plugins_play to load vars for managed-node1 12755 1727204136.18969: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204136.18973: Calling groups_plugins_play to load vars for managed-node1 12755 1727204136.19506: done sending task result for task 12b410aa-8751-72e9-1a19-00000000016c 12755 
1727204136.19511: WORKER PROCESS EXITING 12755 1727204136.21616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204136.26558: done with get_vars() 12755 1727204136.26599: variable 'ansible_search_path' from source: unknown 12755 1727204136.26601: variable 'ansible_search_path' from source: unknown 12755 1727204136.26664: we have included files to process 12755 1727204136.26666: generating all_blocks data 12755 1727204136.26668: done generating all_blocks data 12755 1727204136.26677: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204136.26679: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204136.26682: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12755 1727204136.27652: done processing included file 12755 1727204136.27655: iterating over new_blocks loaded from include file 12755 1727204136.27657: in VariableManager get_vars() 12755 1727204136.27721: done with get_vars() 12755 1727204136.27724: filtering new block on tags 12755 1727204136.27772: done filtering new block on tags 12755 1727204136.27776: in VariableManager get_vars() 12755 1727204136.27829: done with get_vars() 12755 1727204136.27831: filtering new block on tags 12755 1727204136.27897: done filtering new block on tags 12755 1727204136.27906: in VariableManager get_vars() 12755 1727204136.27951: done with get_vars() 12755 1727204136.27954: filtering new block on tags 12755 1727204136.28036: done filtering new block on tags 12755 1727204136.28039: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 12755 1727204136.28045: extending task lists for all hosts 
with included blocks 12755 1727204136.29805: done extending task lists 12755 1727204136.29807: done processing included files 12755 1727204136.29808: results queue empty 12755 1727204136.29811: checking for any_errors_fatal 12755 1727204136.29816: done checking for any_errors_fatal 12755 1727204136.29817: checking for max_fail_percentage 12755 1727204136.29818: done checking for max_fail_percentage 12755 1727204136.29819: checking to see if all hosts have failed and the running result is not ok 12755 1727204136.29821: done checking to see if all hosts have failed 12755 1727204136.29822: getting the remaining hosts for this loop 12755 1727204136.29823: done getting the remaining hosts for this loop 12755 1727204136.29826: getting the next task for host managed-node1 12755 1727204136.29832: done getting next task for host managed-node1 12755 1727204136.29835: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12755 1727204136.29840: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204136.29854: getting variables 12755 1727204136.29855: in VariableManager get_vars() 12755 1727204136.29894: Calling all_inventory to load vars for managed-node1 12755 1727204136.29897: Calling groups_inventory to load vars for managed-node1 12755 1727204136.29900: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204136.29907: Calling all_plugins_play to load vars for managed-node1 12755 1727204136.29913: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204136.29918: Calling groups_plugins_play to load vars for managed-node1 12755 1727204136.32179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204136.36050: done with get_vars() 12755 1727204136.36216: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.199) 0:01:01.601 ***** 12755 1727204136.36540: entering _queue_task() for managed-node1/setup 12755 1727204136.37318: worker is 1 (out of 1 available) 12755 1727204136.37335: exiting _queue_task() for managed-node1/setup 12755 1727204136.37349: done queuing things up, now waiting for results queue to drain 12755 1727204136.37350: waiting for pending results... 
12755 1727204136.37834: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12755 1727204136.38301: in run() - task 12b410aa-8751-72e9-1a19-000000000914 12755 1727204136.38305: variable 'ansible_search_path' from source: unknown 12755 1727204136.38311: variable 'ansible_search_path' from source: unknown 12755 1727204136.38316: calling self._execute() 12755 1727204136.38613: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204136.38618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204136.38631: variable 'omit' from source: magic vars 12755 1727204136.39560: variable 'ansible_distribution_major_version' from source: facts 12755 1727204136.39574: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204136.40379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204136.45724: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204136.45808: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204136.45850: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204136.45892: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204136.46145: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204136.46242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204136.46376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204136.46802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204136.46805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204136.46808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204136.46837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204136.46866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204136.47067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204136.47298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204136.47301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204136.47612: variable '__network_required_facts' from source: role 
'' defaults 12755 1727204136.47616: variable 'ansible_facts' from source: unknown 12755 1727204136.49293: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12755 1727204136.49298: when evaluation is False, skipping this task 12755 1727204136.49301: _execute() done 12755 1727204136.49307: dumping result to json 12755 1727204136.49312: done dumping result, returning 12755 1727204136.49321: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-72e9-1a19-000000000914] 12755 1727204136.49328: sending task result for task 12b410aa-8751-72e9-1a19-000000000914 12755 1727204136.49438: done sending task result for task 12b410aa-8751-72e9-1a19-000000000914 12755 1727204136.49441: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204136.49501: no more pending results, returning what we have 12755 1727204136.49505: results queue empty 12755 1727204136.49507: checking for any_errors_fatal 12755 1727204136.49508: done checking for any_errors_fatal 12755 1727204136.49511: checking for max_fail_percentage 12755 1727204136.49514: done checking for max_fail_percentage 12755 1727204136.49514: checking to see if all hosts have failed and the running result is not ok 12755 1727204136.49516: done checking to see if all hosts have failed 12755 1727204136.49517: getting the remaining hosts for this loop 12755 1727204136.49518: done getting the remaining hosts for this loop 12755 1727204136.49524: getting the next task for host managed-node1 12755 1727204136.49535: done getting next task for host managed-node1 12755 1727204136.49539: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204136.49546: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, 
handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204136.49570: getting variables 12755 1727204136.49572: in VariableManager get_vars() 12755 1727204136.49634: Calling all_inventory to load vars for managed-node1 12755 1727204136.49637: Calling groups_inventory to load vars for managed-node1 12755 1727204136.49640: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204136.49652: Calling all_plugins_play to load vars for managed-node1 12755 1727204136.49656: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204136.49659: Calling groups_plugins_play to load vars for managed-node1 12755 1727204136.54324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204136.59527: done with get_vars() 12755 1727204136.59575: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.231) 0:01:01.832 ***** 12755 1727204136.59723: entering _queue_task() for managed-node1/stat 12755 1727204136.60314: worker is 1 (out of 1 available) 12755 1727204136.60326: exiting _queue_task() for managed-node1/stat 12755 1727204136.60339: done queuing things up, now waiting for results queue to drain 12755 1727204136.60342: waiting for pending results... 
12755 1727204136.60507: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12755 1727204136.60768: in run() - task 12b410aa-8751-72e9-1a19-000000000916 12755 1727204136.60802: variable 'ansible_search_path' from source: unknown 12755 1727204136.60825: variable 'ansible_search_path' from source: unknown 12755 1727204136.60869: calling self._execute() 12755 1727204136.61297: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204136.61407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204136.61414: variable 'omit' from source: magic vars 12755 1727204136.62208: variable 'ansible_distribution_major_version' from source: facts 12755 1727204136.62233: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204136.62678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204136.63361: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204136.63508: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204136.63564: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204136.63677: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204136.63896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204136.63933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204136.64002: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204136.64115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204136.64508: variable '__network_is_ostree' from source: set_fact 12755 1727204136.64514: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204136.64517: when evaluation is False, skipping this task 12755 1727204136.64521: _execute() done 12755 1727204136.64523: dumping result to json 12755 1727204136.64526: done dumping result, returning 12755 1727204136.64529: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-72e9-1a19-000000000916] 12755 1727204136.64532: sending task result for task 12b410aa-8751-72e9-1a19-000000000916 12755 1727204136.64804: done sending task result for task 12b410aa-8751-72e9-1a19-000000000916 12755 1727204136.64808: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204136.64955: no more pending results, returning what we have 12755 1727204136.64959: results queue empty 12755 1727204136.64961: checking for any_errors_fatal 12755 1727204136.64971: done checking for any_errors_fatal 12755 1727204136.64972: checking for max_fail_percentage 12755 1727204136.64974: done checking for max_fail_percentage 12755 1727204136.64975: checking to see if all hosts have failed and the running result is not ok 12755 1727204136.64977: done checking to see if all hosts have failed 12755 1727204136.64978: getting the remaining hosts for this loop 12755 1727204136.64979: done getting the remaining hosts for this loop 12755 
1727204136.64985: getting the next task for host managed-node1 12755 1727204136.64997: done getting next task for host managed-node1 12755 1727204136.65001: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204136.65008: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204136.65040: getting variables 12755 1727204136.65042: in VariableManager get_vars() 12755 1727204136.65317: Calling all_inventory to load vars for managed-node1 12755 1727204136.65321: Calling groups_inventory to load vars for managed-node1 12755 1727204136.65324: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204136.65336: Calling all_plugins_play to load vars for managed-node1 12755 1727204136.65340: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204136.65344: Calling groups_plugins_play to load vars for managed-node1 12755 1727204136.70098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204136.75238: done with get_vars() 12755 1727204136.75282: done getting variables 12755 1727204136.75365: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.156) 0:01:01.989 ***** 12755 1727204136.75419: entering _queue_task() for managed-node1/set_fact 12755 1727204136.75830: worker is 1 (out of 1 available) 12755 1727204136.75844: exiting _queue_task() for managed-node1/set_fact 12755 1727204136.75861: done queuing things up, now waiting for results queue to drain 12755 1727204136.75863: waiting for pending results... 
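The "skipping: [managed-node1]" result earlier in this log is a plain JSON payload recording which `when` expression failed. As a quick sketch (the payload below is copied verbatim from the ostree-check task above; the parsing one-liner is illustrative, not part of the run), the skip reason can be pulled out like this:

```shell
# Parse the per-host "skipping" result shown in this log.
# The JSON is copied from the ostree-check task output above.
result='{"changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False"}'
printf '%s' "$result" | python3 -c '
import json, sys
r = json.load(sys.stdin)
print(r["false_condition"])  # the "when" expression that evaluated False
print(r["skip_reason"])
'
```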
12755 1727204136.76226: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12755 1727204136.76496: in run() - task 12b410aa-8751-72e9-1a19-000000000917 12755 1727204136.76502: variable 'ansible_search_path' from source: unknown 12755 1727204136.76505: variable 'ansible_search_path' from source: unknown 12755 1727204136.76648: calling self._execute() 12755 1727204136.76678: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204136.76693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204136.76711: variable 'omit' from source: magic vars 12755 1727204136.77192: variable 'ansible_distribution_major_version' from source: facts 12755 1727204136.77218: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204136.77495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204136.77793: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204136.77862: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204136.77913: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204136.77966: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204136.78075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204136.78173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204136.78179: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204136.78203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204136.78322: variable '__network_is_ostree' from source: set_fact 12755 1727204136.78335: Evaluated conditional (not __network_is_ostree is defined): False 12755 1727204136.78346: when evaluation is False, skipping this task 12755 1727204136.78354: _execute() done 12755 1727204136.78363: dumping result to json 12755 1727204136.78372: done dumping result, returning 12755 1727204136.78390: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-72e9-1a19-000000000917] 12755 1727204136.78407: sending task result for task 12b410aa-8751-72e9-1a19-000000000917 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12755 1727204136.78664: no more pending results, returning what we have 12755 1727204136.78668: results queue empty 12755 1727204136.78670: checking for any_errors_fatal 12755 1727204136.78678: done checking for any_errors_fatal 12755 1727204136.78679: checking for max_fail_percentage 12755 1727204136.78681: done checking for max_fail_percentage 12755 1727204136.78682: checking to see if all hosts have failed and the running result is not ok 12755 1727204136.78684: done checking to see if all hosts have failed 12755 1727204136.78685: getting the remaining hosts for this loop 12755 1727204136.78687: done getting the remaining hosts for this loop 12755 1727204136.78696: getting the next task for host managed-node1 12755 1727204136.78715: done getting next task for host managed-node1 12755 
1727204136.78720: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204136.78727: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204136.78755: getting variables 12755 1727204136.78757: in VariableManager get_vars() 12755 1727204136.79029: Calling all_inventory to load vars for managed-node1 12755 1727204136.79033: Calling groups_inventory to load vars for managed-node1 12755 1727204136.79036: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204136.79049: Calling all_plugins_play to load vars for managed-node1 12755 1727204136.79053: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204136.79057: Calling groups_plugins_play to load vars for managed-node1 12755 1727204136.79635: done sending task result for task 12b410aa-8751-72e9-1a19-000000000917 12755 1727204136.79639: WORKER PROCESS EXITING 12755 1727204136.81382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204136.85764: done with get_vars() 12755 1727204136.85804: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:36 -0400 (0:00:00.108) 0:01:02.098 ***** 12755 1727204136.86294: entering _queue_task() for managed-node1/service_facts 12755 1727204136.86763: worker is 1 (out of 1 available) 12755 1727204136.86777: exiting _queue_task() for managed-node1/service_facts 12755 1727204136.86793: done queuing things up, now waiting for results queue to drain 12755 1727204136.86794: waiting for pending results... 
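The service_facts run that follows goes through Ansible's usual low-level sequence, all visible in the log: discover the remote home (`echo ~ && sleep 0`), create a private per-task directory under `~/.ansible/tmp` with `umask 77`, transfer the AnsiballZ payload over sftp, `chmod u+x`, then execute it with the remote Python. A minimal local re-creation of just the tmpdir step (the `demo-ansible-tmp` path is a throwaway stand-in, not the `/root/.ansible/tmp/ansible-tmp-...` directory from this run):

```shell
# Re-create the remote tmpdir pattern from the log locally:
# umask 77 makes both directories private (mode 0700), matching
# what Ansible does before transferring the module payload.
base="./demo-ansible-tmp"
( umask 77 && mkdir -p "$base" && mkdir "$base/ansible-tmp-example" \
  && echo ansible_tmp="$base/ansible-tmp-example" )
```

The trailing `echo` mirrors how the real command reports the generated directory name back to the controller on stdout.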
12755 1727204136.87021: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 12755 1727204136.87399: in run() - task 12b410aa-8751-72e9-1a19-000000000919 12755 1727204136.87403: variable 'ansible_search_path' from source: unknown 12755 1727204136.87407: variable 'ansible_search_path' from source: unknown 12755 1727204136.87415: calling self._execute() 12755 1727204136.87418: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204136.87420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204136.87424: variable 'omit' from source: magic vars 12755 1727204136.87897: variable 'ansible_distribution_major_version' from source: facts 12755 1727204136.87901: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204136.87905: variable 'omit' from source: magic vars 12755 1727204136.87993: variable 'omit' from source: magic vars 12755 1727204136.88042: variable 'omit' from source: magic vars 12755 1727204136.88088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204136.88135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204136.88159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204136.88179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204136.88196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204136.88237: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204136.88241: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204136.88244: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12755 1727204136.88374: Set connection var ansible_connection to ssh 12755 1727204136.88382: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204136.88386: Set connection var ansible_shell_type to sh 12755 1727204136.88404: Set connection var ansible_timeout to 10 12755 1727204136.88413: Set connection var ansible_shell_executable to /bin/sh 12755 1727204136.88419: Set connection var ansible_pipelining to False 12755 1727204136.88451: variable 'ansible_shell_executable' from source: unknown 12755 1727204136.88455: variable 'ansible_connection' from source: unknown 12755 1727204136.88458: variable 'ansible_module_compression' from source: unknown 12755 1727204136.88463: variable 'ansible_shell_type' from source: unknown 12755 1727204136.88465: variable 'ansible_shell_executable' from source: unknown 12755 1727204136.88470: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204136.88476: variable 'ansible_pipelining' from source: unknown 12755 1727204136.88480: variable 'ansible_timeout' from source: unknown 12755 1727204136.88486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204136.88768: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204136.88773: variable 'omit' from source: magic vars 12755 1727204136.88775: starting attempt loop 12755 1727204136.88778: running the handler 12755 1727204136.88780: _low_level_execute_command(): starting 12755 1727204136.88782: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204136.89694: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204136.89701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204136.89704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204136.89706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204136.89708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204136.89776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204136.91706: stdout chunk (state=3): >>>/root <<< 12755 1727204136.91913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204136.92284: stderr chunk (state=3): >>><<< 12755 1727204136.92288: stdout chunk (state=3): >>><<< 12755 1727204136.92318: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204136.92336: _low_level_execute_command(): starting 12755 1727204136.92345: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138 `" && echo ansible-tmp-1727204136.9231942-16270-34757259215138="` echo /root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138 `" ) && sleep 0' 12755 1727204136.93875: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204136.93964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204136.94165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204136.94597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204136.96354: stdout chunk (state=3): >>>ansible-tmp-1727204136.9231942-16270-34757259215138=/root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138 <<< 12755 1727204136.96706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204136.96712: stdout chunk (state=3): >>><<< 12755 1727204136.96719: stderr chunk (state=3): >>><<< 12755 1727204136.96741: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204136.9231942-16270-34757259215138=/root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204136.96799: variable 'ansible_module_compression' from source: unknown 12755 1727204136.96915: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12755 1727204136.97072: variable 'ansible_facts' from source: unknown 12755 1727204136.97327: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/AnsiballZ_service_facts.py 12755 1727204136.97915: Sending initial data 12755 1727204136.97918: Sent initial data (161 bytes) 12755 1727204136.99186: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204136.99200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204136.99215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204136.99496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204136.99502: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 12755 1727204136.99505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204136.99508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204136.99588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204137.01382: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204137.01495: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204137.01499: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmp_fzy9nf5 /root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/AnsiballZ_service_facts.py <<< 12755 1727204137.01502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/AnsiballZ_service_facts.py" <<< 12755 1727204137.01686: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmp_fzy9nf5" to remote "/root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/AnsiballZ_service_facts.py" <<< 12755 1727204137.03850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204137.03951: stderr chunk (state=3): >>><<< 12755 1727204137.04066: stdout chunk (state=3): >>><<< 12755 1727204137.04093: done transferring module to remote 12755 1727204137.04107: _low_level_execute_command(): starting 12755 1727204137.04115: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/ /root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/AnsiballZ_service_facts.py && sleep 0' 12755 1727204137.05773: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204137.05826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204137.05845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204137.05925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204137.06060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204137.08183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204137.08197: stdout chunk (state=3): >>><<< 12755 1727204137.08256: stderr chunk (state=3): >>><<< 12755 1727204137.08301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204137.08305: _low_level_execute_command(): starting 12755 1727204137.08508: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/AnsiballZ_service_facts.py && sleep 0' 12755 1727204137.09812: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204137.09816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204137.09819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204137.09821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204137.09824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204137.09876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204137.09892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204137.10029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 12755 1727204137.10200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204139.26932: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", 
"source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": 
"systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zr<<< 12755 1727204139.27196: stdout chunk (state=3): >>>am0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"},<<< 12755 1727204139.27203: stdout chunk (state=3): >>> "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "<<< 12755 1727204139.27394: stdout chunk (state=3): >>>systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12755 1727204139.28748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
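The JSON streamed above is the return payload of Ansible's `service_facts` module: a `services` mapping keyed by unit name, where each entry carries `state` (running/stopped/inactive/unknown), `status` (the systemd unit-file state: enabled, disabled, static, indirect, alias, not-found, …), and `source`. A minimal sketch of how such a payload could be filtered outside of Ansible — the embedded dict is a hand-picked four-entry excerpt of the log's fact set, not the full output:

```python
import json

# Excerpt of the "services" mapping from the service_facts return above;
# the real payload on the managed node contains hundreds of entries.
payload = json.loads("""
{"ansible_facts": {"services": {
  "sshd.service":           {"name": "sshd.service",           "state": "running", "status": "enabled",   "source": "systemd"},
  "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled",   "source": "systemd"},
  "rpcbind.service":        {"name": "rpcbind.service",        "state": "stopped", "status": "disabled",  "source": "systemd"},
  "network.service":        {"name": "network.service",        "state": "stopped", "status": "not-found", "source": "systemd"}
}}}
""")

services = payload["ansible_facts"]["services"]

# Units systemd currently reports as running:
running = sorted(name for name, svc in services.items() if svc["state"] == "running")
print(running)  # ['NetworkManager.service', 'sshd.service']

# Unit names that resolve to no unit file ("not-found"), e.g. legacy network.service:
missing = sorted(name for name, svc in services.items() if svc["status"] == "not-found")
print(missing)  # ['network.service']
```

Inside a playbook the same filtering is normally done in Jinja2 against `ansible_facts.services` after a `service_facts` task, e.g. with `selectattr('state', 'equalto', 'running')`.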
<<< 12755 1727204139.28752: stdout chunk (state=3): >>><<< 12755 1727204139.28996: stderr chunk (state=3): >>><<< 12755 1727204139.29002: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": 
{"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", 
"status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": 
"mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": 
"plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
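The `service_facts` payload that ends above is a flat mapping of unit name to a `{"name", "state", "status", "source"}` dict, stored under `ansible_facts.services`. As a minimal sketch (plain Python, illustrative only — not code from the `fedora.linux_system_roles.network` role), the same structure can be filtered to list just the running units; the sample entries mirror values seen in the log:

```python
# Illustrative helper: filter a service_facts-style mapping down to
# units whose state is "running". Sample data copies the shape of the
# module output above; the helper itself is a sketch, not role code.

services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "systemd-journald.service": {"name": "systemd-journald.service",
                                 "state": "running", "status": "static",
                                 "source": "systemd"},
}

def running_units(facts: dict) -> list[str]:
    """Return the names of units whose state is 'running', sorted."""
    return sorted(name for name, svc in facts.items()
                  if svc.get("state") == "running")

print(running_units(services))  # → ['sshd.service', 'systemd-journald.service']
```

In a playbook the equivalent selection is usually done with a Jinja2 `selectattr('state', 'equalto', 'running')` filter over `ansible_facts.services.values()`.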
12755 1727204139.46534: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204139.46543: _low_level_execute_command(): starting 12755 1727204139.46550: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204136.9231942-16270-34757259215138/ > /dev/null 2>&1 && sleep 0' 12755 1727204139.47960: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204139.47977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204139.47993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204139.48011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204139.48022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204139.48181: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204139.48315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204139.48395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204139.50598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204139.50605: stdout chunk (state=3): >>><<< 12755 1727204139.50614: stderr chunk (state=3): >>><<< 12755 1727204139.50717: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204139.50724: handler run 
complete 12755 1727204139.51417: variable 'ansible_facts' from source: unknown 12755 1727204139.52002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204139.53778: variable 'ansible_facts' from source: unknown 12755 1727204139.54067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204139.54473: attempt loop complete, returning result 12755 1727204139.54481: _execute() done 12755 1727204139.54484: dumping result to json 12755 1727204139.54578: done dumping result, returning 12755 1727204139.54795: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-72e9-1a19-000000000919] 12755 1727204139.54798: sending task result for task 12b410aa-8751-72e9-1a19-000000000919 12755 1727204139.68798: done sending task result for task 12b410aa-8751-72e9-1a19-000000000919 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204139.68860: no more pending results, returning what we have 12755 1727204139.68863: results queue empty 12755 1727204139.68864: checking for any_errors_fatal 12755 1727204139.68867: done checking for any_errors_fatal 12755 1727204139.68868: checking for max_fail_percentage 12755 1727204139.68870: done checking for max_fail_percentage 12755 1727204139.68871: checking to see if all hosts have failed and the running result is not ok 12755 1727204139.68872: done checking to see if all hosts have failed 12755 1727204139.68873: getting the remaining hosts for this loop 12755 1727204139.68874: done getting the remaining hosts for this loop 12755 1727204139.68878: getting the next task for host managed-node1 12755 1727204139.68884: done getting next task for host managed-node1 12755 1727204139.68887: ^ task is: TASK: 
fedora.linux_system_roles.network : Check which packages are installed 12755 1727204139.68895: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204139.68912: WORKER PROCESS EXITING 12755 1727204139.69143: getting variables 12755 1727204139.69145: in VariableManager get_vars() 12755 1727204139.69180: Calling all_inventory to load vars for managed-node1 12755 1727204139.69183: Calling groups_inventory to load vars for managed-node1 12755 1727204139.69186: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204139.69196: Calling all_plugins_play to load vars for managed-node1 12755 1727204139.69199: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204139.69204: Calling groups_plugins_play to load vars for managed-node1 12755 1727204139.73500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204139.79798: done with get_vars() 12755 1727204139.79850: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:39 -0400 (0:00:02.936) 0:01:05.035 ***** 12755 1727204139.79968: entering _queue_task() for managed-node1/package_facts 12755 1727204139.80949: worker is 1 (out of 1 available) 12755 1727204139.80963: exiting _queue_task() for managed-node1/package_facts 12755 1727204139.80976: done queuing things up, now waiting for results queue to drain 12755 1727204139.80978: waiting for pending results... 
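The task queued here, `package_facts`, populates `ansible_facts.packages`: a mapping of package name to a list of installed instances (each with keys such as `name`, `version`, `release`, `arch`, `source`). A minimal sketch of consuming that structure — the sample entry and version string are hypothetical, and this helper is illustrative rather than the role's actual logic:

```python
# Illustrative helper for a package_facts-style result: ansible_facts.packages
# maps a package name to a list of installed instances. The sample data below
# is hypothetical (version/release invented for the example).

packages = {
    "NetworkManager": [{"name": "NetworkManager", "version": "1.46.0",
                        "release": "1.fc40", "arch": "x86_64",
                        "source": "rpm"}],
}

def has_package(facts: dict, name: str) -> bool:
    """True when at least one installed instance of the package exists."""
    return bool(facts.get(name))

print(has_package(packages, "NetworkManager"))   # → True
print(has_package(packages, "wpa_supplicant"))   # → False
```

A role would typically express the same check in Jinja2 as `'NetworkManager' in ansible_facts.packages`.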
12755 1727204139.81502: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12755 1727204139.82012: in run() - task 12b410aa-8751-72e9-1a19-00000000091a 12755 1727204139.82027: variable 'ansible_search_path' from source: unknown 12755 1727204139.82031: variable 'ansible_search_path' from source: unknown 12755 1727204139.82190: calling self._execute() 12755 1727204139.82508: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204139.82516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204139.82520: variable 'omit' from source: magic vars 12755 1727204139.83163: variable 'ansible_distribution_major_version' from source: facts 12755 1727204139.83178: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204139.83185: variable 'omit' from source: magic vars 12755 1727204139.83608: variable 'omit' from source: magic vars 12755 1727204139.83773: variable 'omit' from source: magic vars 12755 1727204139.83874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204139.83923: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204139.83948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204139.84084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204139.84100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204139.84137: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204139.84143: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204139.84150: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12755 1727204139.84530: Set connection var ansible_connection to ssh 12755 1727204139.84538: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204139.84541: Set connection var ansible_shell_type to sh 12755 1727204139.84544: Set connection var ansible_timeout to 10 12755 1727204139.84547: Set connection var ansible_shell_executable to /bin/sh 12755 1727204139.84549: Set connection var ansible_pipelining to False 12755 1727204139.84574: variable 'ansible_shell_executable' from source: unknown 12755 1727204139.84578: variable 'ansible_connection' from source: unknown 12755 1727204139.84581: variable 'ansible_module_compression' from source: unknown 12755 1727204139.84584: variable 'ansible_shell_type' from source: unknown 12755 1727204139.84587: variable 'ansible_shell_executable' from source: unknown 12755 1727204139.84755: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204139.84759: variable 'ansible_pipelining' from source: unknown 12755 1727204139.84762: variable 'ansible_timeout' from source: unknown 12755 1727204139.84829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204139.85198: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204139.85296: variable 'omit' from source: magic vars 12755 1727204139.85300: starting attempt loop 12755 1727204139.85303: running the handler 12755 1727204139.85306: _low_level_execute_command(): starting 12755 1727204139.85311: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204139.86048: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12755 1727204139.86107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204139.86171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204139.86195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204139.86207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204139.86287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204139.88349: stdout chunk (state=3): >>>/root <<< 12755 1727204139.88358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204139.88594: stdout chunk (state=3): >>><<< 12755 1727204139.88598: stderr chunk (state=3): >>><<< 12755 1727204139.88602: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204139.88605: _low_level_execute_command(): starting 12755 1727204139.88608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344 `" && echo ansible-tmp-1727204139.8843174-16494-13373241750344="` echo /root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344 `" ) && sleep 0' 12755 1727204139.89221: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204139.89255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204139.89267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204139.89369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204139.89378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204139.89455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204139.91621: stdout chunk (state=3): >>>ansible-tmp-1727204139.8843174-16494-13373241750344=/root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344 <<< 12755 1727204139.91811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204139.91842: stdout chunk (state=3): >>><<< 12755 1727204139.91945: stderr chunk (state=3): >>><<< 12755 1727204139.91999: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204139.8843174-16494-13373241750344=/root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204139.92287: variable 'ansible_module_compression' from source: unknown 12755 1727204139.92374: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12755 1727204139.92553: variable 'ansible_facts' from source: unknown 12755 1727204139.93092: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/AnsiballZ_package_facts.py 12755 1727204139.93468: Sending initial data 12755 1727204139.93478: Sent initial data (161 bytes) 12755 1727204139.94909: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204139.95027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204139.95042: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204139.95058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204139.95207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204139.96915: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12755 1727204139.97023: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204139.97065: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204139.97145: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpgpllxix9 /root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/AnsiballZ_package_facts.py <<< 12755 1727204139.97149: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/AnsiballZ_package_facts.py" <<< 12755 1727204139.97272: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpgpllxix9" to remote "/root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/AnsiballZ_package_facts.py" <<< 12755 1727204140.04138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204140.04166: stderr chunk (state=3): >>><<< 12755 1727204140.04170: stdout chunk (state=3): >>><<< 12755 1727204140.04256: done transferring module to remote 12755 1727204140.04260: _low_level_execute_command(): starting 12755 1727204140.04262: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/ /root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/AnsiballZ_package_facts.py && sleep 0' 12755 1727204140.05670: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204140.05675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204140.05677: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204140.05680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204140.05683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204140.05815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204140.05819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204140.05835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204140.05907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204140.07991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204140.08077: stderr chunk (state=3): >>><<< 12755 1727204140.08204: stdout chunk (state=3): >>><<< 12755 1727204140.08298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204140.08302: _low_level_execute_command(): starting 12755 1727204140.08304: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/AnsiballZ_package_facts.py && sleep 0' 12755 1727204140.09465: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204140.09480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204140.09596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204140.09833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 
1727204140.09884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204140.77965: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": 
"rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 12755 1727204140.78034: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": 
"alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 12755 1727204140.78058: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", 
"version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 12755 1727204140.78071: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", 
"version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": 
"2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, <<< 12755 1727204140.78123: stdout chunk (state=3): >>>"arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": 
"psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": 
"121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": 
"gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 12755 1727204140.78149: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 12755 1727204140.78177: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": 
"nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5<<< 12755 1727204140.78184: stdout chunk (state=3): >>>", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", 
"epoch": 0, "arch": "x86_64", <<< 12755 1727204140.78224: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 12755 1727204140.78247: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", 
"version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 12755 1727204140.78260: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": 
"5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", 
"release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, 
"invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12755 1727204140.80376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204140.80379: stdout chunk (state=3): >>><<< 12755 1727204140.80382: stderr chunk (state=3): >>><<< 12755 1727204140.80606: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": 
"kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": 
[{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", 
"version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", 
"version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": 
"libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", 
"version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", 
"release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": 
"gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": 
"openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": 
"libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": 
[{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": 
[{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", 
"version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", 
"version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": 
"gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204140.83094: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204140.83129: _low_level_execute_command(): starting 12755 1727204140.83138: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204139.8843174-16494-13373241750344/ > /dev/null 2>&1 && sleep 0' 12755 
1727204140.83829: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204140.83864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204140.83877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204140.83887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204140.83971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204140.86069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204140.86120: stderr chunk (state=3): >>><<< 12755 1727204140.86124: stdout chunk (state=3): >>><<< 12755 1727204140.86137: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204140.86144: handler run complete 12755 1727204140.87146: variable 'ansible_facts' from source: unknown 12755 1727204140.88000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204140.91472: variable 'ansible_facts' from source: unknown 12755 1727204140.91925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204140.93230: attempt loop complete, returning result 12755 1727204140.93234: _execute() done 12755 1727204140.93236: dumping result to json 12755 1727204140.93596: done dumping result, returning 12755 1727204140.93600: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-72e9-1a19-00000000091a] 12755 1727204140.93603: sending task result for task 12b410aa-8751-72e9-1a19-00000000091a ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204140.97023: done sending task result for task 12b410aa-8751-72e9-1a19-00000000091a 12755 1727204140.97034: no 
more pending results, returning what we have 12755 1727204140.97037: results queue empty 12755 1727204140.97038: checking for any_errors_fatal 12755 1727204140.97041: done checking for any_errors_fatal 12755 1727204140.97042: checking for max_fail_percentage 12755 1727204140.97043: done checking for max_fail_percentage 12755 1727204140.97044: checking to see if all hosts have failed and the running result is not ok 12755 1727204140.97044: done checking to see if all hosts have failed 12755 1727204140.97045: getting the remaining hosts for this loop 12755 1727204140.97046: done getting the remaining hosts for this loop 12755 1727204140.97049: getting the next task for host managed-node1 12755 1727204140.97055: done getting next task for host managed-node1 12755 1727204140.97058: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204140.97061: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204140.97069: WORKER PROCESS EXITING 12755 1727204140.97077: getting variables 12755 1727204140.97079: in VariableManager get_vars() 12755 1727204140.97119: Calling all_inventory to load vars for managed-node1 12755 1727204140.97122: Calling groups_inventory to load vars for managed-node1 12755 1727204140.97124: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204140.97132: Calling all_plugins_play to load vars for managed-node1 12755 1727204140.97134: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204140.97136: Calling groups_plugins_play to load vars for managed-node1 12755 1727204140.98265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204141.00447: done with get_vars() 12755 1727204141.00482: done getting variables 12755 1727204141.00556: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:41 -0400 (0:00:01.206) 0:01:06.241 ***** 12755 1727204141.00606: entering _queue_task() for managed-node1/debug 12755 1727204141.01162: worker is 1 (out of 1 available) 12755 1727204141.01176: exiting _queue_task() for managed-node1/debug 12755 1727204141.01192: done queuing things up, now waiting for results queue to drain 12755 1727204141.01194: waiting for pending results... 
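The censored `package_facts` result above (reported as hidden because `no_log: true` was specified) corresponds to a task shaped roughly like the sketch below. The task name and the module arguments (`manager`, `strategy`) are taken directly from the log's `invocation` record; the surrounding playbook layout is an assumption.

```yaml
# Sketch of the fact-gathering task seen above (layout assumed).
# no_log: true is why the task result is printed as censored.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
    strategy: first
  no_log: true
```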
12755 1727204141.01632: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 12755 1727204141.01695: in run() - task 12b410aa-8751-72e9-1a19-00000000016d 12755 1727204141.01725: variable 'ansible_search_path' from source: unknown 12755 1727204141.01733: variable 'ansible_search_path' from source: unknown 12755 1727204141.01780: calling self._execute() 12755 1727204141.01913: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204141.01933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204141.01952: variable 'omit' from source: magic vars 12755 1727204141.02433: variable 'ansible_distribution_major_version' from source: facts 12755 1727204141.02470: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204141.02473: variable 'omit' from source: magic vars 12755 1727204141.02572: variable 'omit' from source: magic vars 12755 1727204141.02714: variable 'network_provider' from source: set_fact 12755 1727204141.02743: variable 'omit' from source: magic vars 12755 1727204141.02798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204141.02849: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204141.02880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204141.02916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204141.02938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204141.02975: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204141.02984: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 
1727204141.02996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204141.03137: Set connection var ansible_connection to ssh 12755 1727204141.03153: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204141.03161: Set connection var ansible_shell_type to sh 12755 1727204141.03181: Set connection var ansible_timeout to 10 12755 1727204141.03196: Set connection var ansible_shell_executable to /bin/sh 12755 1727204141.03211: Set connection var ansible_pipelining to False 12755 1727204141.03248: variable 'ansible_shell_executable' from source: unknown 12755 1727204141.03258: variable 'ansible_connection' from source: unknown 12755 1727204141.03267: variable 'ansible_module_compression' from source: unknown 12755 1727204141.03275: variable 'ansible_shell_type' from source: unknown 12755 1727204141.03284: variable 'ansible_shell_executable' from source: unknown 12755 1727204141.03344: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204141.03347: variable 'ansible_pipelining' from source: unknown 12755 1727204141.03350: variable 'ansible_timeout' from source: unknown 12755 1727204141.03352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204141.03512: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204141.03533: variable 'omit' from source: magic vars 12755 1727204141.03543: starting attempt loop 12755 1727204141.03552: running the handler 12755 1727204141.03614: handler run complete 12755 1727204141.03669: attempt loop complete, returning result 12755 1727204141.03672: _execute() done 12755 1727204141.03675: dumping result to json 12755 1727204141.03678: done dumping result, returning 
12755 1727204141.03680: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-72e9-1a19-00000000016d] 12755 1727204141.03683: sending task result for task 12b410aa-8751-72e9-1a19-00000000016d ok: [managed-node1] => {} MSG: Using network provider: nm 12755 1727204141.04004: no more pending results, returning what we have 12755 1727204141.04011: results queue empty 12755 1727204141.04013: checking for any_errors_fatal 12755 1727204141.04024: done checking for any_errors_fatal 12755 1727204141.04025: checking for max_fail_percentage 12755 1727204141.04026: done checking for max_fail_percentage 12755 1727204141.04027: checking to see if all hosts have failed and the running result is not ok 12755 1727204141.04028: done checking to see if all hosts have failed 12755 1727204141.04029: getting the remaining hosts for this loop 12755 1727204141.04031: done getting the remaining hosts for this loop 12755 1727204141.04036: getting the next task for host managed-node1 12755 1727204141.04044: done getting next task for host managed-node1 12755 1727204141.04048: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12755 1727204141.04052: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204141.04068: getting variables 12755 1727204141.04070: in VariableManager get_vars() 12755 1727204141.04333: Calling all_inventory to load vars for managed-node1 12755 1727204141.04337: Calling groups_inventory to load vars for managed-node1 12755 1727204141.04340: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204141.04352: Calling all_plugins_play to load vars for managed-node1 12755 1727204141.04357: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204141.04361: Calling groups_plugins_play to load vars for managed-node1 12755 1727204141.05006: done sending task result for task 12b410aa-8751-72e9-1a19-00000000016d 12755 1727204141.05013: WORKER PROCESS EXITING 12755 1727204141.06594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204141.09592: done with get_vars() 12755 1727204141.09634: done getting variables 12755 1727204141.09711: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.091) 0:01:06.333 ***** 12755 1727204141.09756: entering _queue_task() for managed-node1/fail 12755 1727204141.10118: worker is 1 (out of 1 available) 12755 1727204141.10132: exiting _queue_task() for 
managed-node1/fail 12755 1727204141.10146: done queuing things up, now waiting for results queue to drain 12755 1727204141.10148: waiting for pending results... 12755 1727204141.10482: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12755 1727204141.10704: in run() - task 12b410aa-8751-72e9-1a19-00000000016e 12755 1727204141.10733: variable 'ansible_search_path' from source: unknown 12755 1727204141.10744: variable 'ansible_search_path' from source: unknown 12755 1727204141.10792: calling self._execute() 12755 1727204141.10918: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204141.10931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204141.10953: variable 'omit' from source: magic vars 12755 1727204141.11416: variable 'ansible_distribution_major_version' from source: facts 12755 1727204141.11436: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204141.11592: variable 'network_state' from source: role '' defaults 12755 1727204141.11617: Evaluated conditional (network_state != {}): False 12755 1727204141.11627: when evaluation is False, skipping this task 12755 1727204141.11635: _execute() done 12755 1727204141.11644: dumping result to json 12755 1727204141.11653: done dumping result, returning 12755 1727204141.11666: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-72e9-1a19-00000000016e] 12755 1727204141.11679: sending task result for task 12b410aa-8751-72e9-1a19-00000000016e skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204141.11947: no more 
pending results, returning what we have 12755 1727204141.11951: results queue empty 12755 1727204141.11953: checking for any_errors_fatal 12755 1727204141.11961: done checking for any_errors_fatal 12755 1727204141.11963: checking for max_fail_percentage 12755 1727204141.11965: done checking for max_fail_percentage 12755 1727204141.11966: checking to see if all hosts have failed and the running result is not ok 12755 1727204141.11967: done checking to see if all hosts have failed 12755 1727204141.11968: getting the remaining hosts for this loop 12755 1727204141.11969: done getting the remaining hosts for this loop 12755 1727204141.11975: getting the next task for host managed-node1 12755 1727204141.11984: done getting next task for host managed-node1 12755 1727204141.11992: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12755 1727204141.11998: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204141.12030: getting variables 12755 1727204141.12033: in VariableManager get_vars() 12755 1727204141.12395: Calling all_inventory to load vars for managed-node1 12755 1727204141.12399: Calling groups_inventory to load vars for managed-node1 12755 1727204141.12402: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204141.12416: Calling all_plugins_play to load vars for managed-node1 12755 1727204141.12420: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204141.12425: Calling groups_plugins_play to load vars for managed-node1 12755 1727204141.13204: done sending task result for task 12b410aa-8751-72e9-1a19-00000000016e 12755 1727204141.13208: WORKER PROCESS EXITING 12755 1727204141.16123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204141.20450: done with get_vars() 12755 1727204141.20508: done getting variables 12755 1727204141.20602: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.108) 0:01:06.442 ***** 12755 1727204141.20655: entering _queue_task() for managed-node1/fail 12755 1727204141.21084: worker is 1 (out of 1 available) 12755 1727204141.21102: exiting _queue_task() for managed-node1/fail 12755 1727204141.21116: done queuing things up, now waiting for results queue to drain 12755 1727204141.21123: waiting for pending results... 
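The `Print network provider` task above resolves `network_provider` (set earlier via `set_fact`, per the "from source: set_fact" line) and renders `Using network provider: nm`. A minimal sketch of such a debug task follows; the exact `msg` template is assumed from the rendered output, not quoted from the role.

```yaml
# Hypothetical sketch of the "Print network provider" debug task;
# the msg template is inferred from the rendered MSG in the log.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```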
12755 1727204141.21348: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12755 1727204141.21473: in run() - task 12b410aa-8751-72e9-1a19-00000000016f 12755 1727204141.21486: variable 'ansible_search_path' from source: unknown 12755 1727204141.21491: variable 'ansible_search_path' from source: unknown 12755 1727204141.21532: calling self._execute() 12755 1727204141.21629: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204141.21637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204141.21648: variable 'omit' from source: magic vars 12755 1727204141.22096: variable 'ansible_distribution_major_version' from source: facts 12755 1727204141.22118: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204141.22294: variable 'network_state' from source: role '' defaults 12755 1727204141.22327: Evaluated conditional (network_state != {}): False 12755 1727204141.22337: when evaluation is False, skipping this task 12755 1727204141.22346: _execute() done 12755 1727204141.22356: dumping result to json 12755 1727204141.22364: done dumping result, returning 12755 1727204141.22378: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-72e9-1a19-00000000016f] 12755 1727204141.22413: sending task result for task 12b410aa-8751-72e9-1a19-00000000016f skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204141.22598: no more pending results, returning what we have 12755 1727204141.22603: results queue empty 12755 1727204141.22604: checking for any_errors_fatal 12755 1727204141.22615: done checking for any_errors_fatal 
12755 1727204141.22616: checking for max_fail_percentage
12755 1727204141.22618: done checking for max_fail_percentage
12755 1727204141.22619: checking to see if all hosts have failed and the running result is not ok
12755 1727204141.22620: done checking to see if all hosts have failed
12755 1727204141.22621: getting the remaining hosts for this loop
12755 1727204141.22623: done getting the remaining hosts for this loop
12755 1727204141.22628: getting the next task for host managed-node1
12755 1727204141.22637: done getting next task for host managed-node1
12755 1727204141.22643: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12755 1727204141.22649: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12755 1727204141.22679: getting variables
12755 1727204141.22681: in VariableManager get_vars()
12755 1727204141.23200: Calling all_inventory to load vars for managed-node1
12755 1727204141.23204: Calling groups_inventory to load vars for managed-node1
12755 1727204141.23208: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204141.23223: Calling all_plugins_play to load vars for managed-node1
12755 1727204141.23227: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204141.23347: done sending task result for task 12b410aa-8751-72e9-1a19-00000000016f
12755 1727204141.23351: WORKER PROCESS EXITING
12755 1727204141.23382: Calling groups_plugins_play to load vars for managed-node1
12755 1727204141.26655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204141.32753: done with get_vars()
12755 1727204141.32821: done getting variables
12755 1727204141.33225: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.126) 0:01:06.569 *****
12755 1727204141.33345: entering _queue_task() for managed-node1/fail
12755 1727204141.34219: worker is 1 (out of 1 available)
12755 1727204141.34235: exiting _queue_task() for managed-node1/fail
12755 1727204141.34249: done queuing things up, now waiting for results queue to drain
12755 1727204141.34251: waiting for pending results...
12755 1727204141.35050: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12755 1727204141.35327: in run() - task 12b410aa-8751-72e9-1a19-000000000170
12755 1727204141.35457: variable 'ansible_search_path' from source: unknown
12755 1727204141.35461: variable 'ansible_search_path' from source: unknown
12755 1727204141.35508: calling self._execute()
12755 1727204141.35750: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204141.35759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204141.35888: variable 'omit' from source: magic vars
12755 1727204141.36976: variable 'ansible_distribution_major_version' from source: facts
12755 1727204141.36981: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204141.37613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204141.43586: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204141.43895: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204141.43899: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204141.43996: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204141.44000: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204141.44347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204141.44351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204141.44382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.44737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204141.44742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204141.44782: variable 'ansible_distribution_major_version' from source: facts
12755 1727204141.45006: Evaluated conditional (ansible_distribution_major_version | int > 9): True
12755 1727204141.45161: variable 'ansible_distribution' from source: facts
12755 1727204141.45165: variable '__network_rh_distros' from source: role '' defaults
12755 1727204141.45191: Evaluated conditional (ansible_distribution in __network_rh_distros): False
12755 1727204141.45194: when evaluation is False, skipping this task
12755 1727204141.45196: _execute() done
12755 1727204141.45398: dumping result to json
12755 1727204141.45404: done dumping result, returning
12755 1727204141.45415: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-72e9-1a19-000000000170]
12755 1727204141.45422: sending task result for task 12b410aa-8751-72e9-1a19-000000000170
12755 1727204141.45738: done sending task result for task 12b410aa-8751-72e9-1a19-000000000170
12755 1727204141.45742: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
12755 1727204141.45820: no more pending results, returning what we have
12755 1727204141.45825: results queue empty
12755 1727204141.45827: checking for any_errors_fatal
12755 1727204141.45833: done checking for any_errors_fatal
12755 1727204141.45834: checking for max_fail_percentage
12755 1727204141.45839: done checking for max_fail_percentage
12755 1727204141.45840: checking to see if all hosts have failed and the running result is not ok
12755 1727204141.45842: done checking to see if all hosts have failed
12755 1727204141.45845: getting the remaining hosts for this loop
12755 1727204141.45846: done getting the remaining hosts for this loop
12755 1727204141.45854: getting the next task for host managed-node1
12755 1727204141.45905: done getting next task for host managed-node1
12755 1727204141.45912: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12755 1727204141.45917: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12755 1727204141.45939: getting variables
12755 1727204141.45942: in VariableManager get_vars()
12755 1727204141.46124: Calling all_inventory to load vars for managed-node1
12755 1727204141.46128: Calling groups_inventory to load vars for managed-node1
12755 1727204141.46131: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204141.46142: Calling all_plugins_play to load vars for managed-node1
12755 1727204141.46145: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204141.46149: Calling groups_plugins_play to load vars for managed-node1
12755 1727204141.53041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204141.59572: done with get_vars()
12755 1727204141.59671: done getting variables
12755 1727204141.59804: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.265) 0:01:06.834 *****
12755 1727204141.59894: entering _queue_task() for managed-node1/dnf
12755 1727204141.60910: worker is 1 (out of 1 available)
12755 1727204141.60922: exiting _queue_task() for managed-node1/dnf
12755 1727204141.60935: done queuing things up, now waiting for results queue to drain
12755 1727204141.60937: waiting for pending results...
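Two conditionals decide the teaming-abort task above: `ansible_distribution_major_version | int > 9` (True on this node) and `ansible_distribution in __network_rh_distros` (False, so Fedora is evidently not in the role's `__network_rh_distros` default here). Entries in a `when` list are ANDed, so one False is enough to skip. A hypothetical sketch of such a task, with the message text invented:

```yaml
# Hypothetical sketch inferred from the evaluated conditionals in the
# log; variable names come from the log, the msg text is an assumption.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on EL10 or later.  # assumed wording
  when:
    - ansible_distribution in __network_rh_distros   # False here -> task skipped
    - ansible_distribution_major_version | int > 9   # True here
```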
12755 1727204141.61423: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12755 1727204141.61451: in run() - task 12b410aa-8751-72e9-1a19-000000000171
12755 1727204141.61472: variable 'ansible_search_path' from source: unknown
12755 1727204141.61480: variable 'ansible_search_path' from source: unknown
12755 1727204141.61542: calling self._execute()
12755 1727204141.61696: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204141.61720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204141.61749: variable 'omit' from source: magic vars
12755 1727204141.62418: variable 'ansible_distribution_major_version' from source: facts
12755 1727204141.62439: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204141.62842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204141.67906: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204141.67915: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204141.68003: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204141.68114: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204141.68186: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204141.68323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204141.68375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204141.68422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.68515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204141.68540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204141.68784: variable 'ansible_distribution' from source: facts
12755 1727204141.68856: variable 'ansible_distribution_major_version' from source: facts
12755 1727204141.68863: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
12755 1727204141.69154: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204141.69462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204141.69538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204141.69640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.69723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204141.69861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204141.69880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204141.69923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204141.69961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.70065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204141.70294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204141.70298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204141.70300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204141.70303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.70305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204141.70308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204141.70603: variable 'network_connections' from source: task vars
12755 1727204141.70650: variable 'controller_profile' from source: play vars
12755 1727204141.70794: variable 'controller_profile' from source: play vars
12755 1727204141.70905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204141.71167: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204141.71231: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204141.71272: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204141.71325: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204141.71382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204141.71494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204141.71505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.71512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204141.71628: variable '__network_team_connections_defined' from source: role '' defaults
12755 1727204141.71999: variable 'network_connections' from source: task vars
12755 1727204141.72013: variable 'controller_profile' from source: play vars
12755 1727204141.72105: variable 'controller_profile' from source: play vars
12755 1727204141.72142: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12755 1727204141.72151: when evaluation is False, skipping this task
12755 1727204141.72158: _execute() done
12755 1727204141.72170: dumping result to json
12755 1727204141.72183: done dumping result, returning
12755 1727204141.72202: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000171]
12755 1727204141.72280: sending task result for task 12b410aa-8751-72e9-1a19-000000000171
12755 1727204141.72367: done sending task result for task 12b410aa-8751-72e9-1a19-000000000171
12755 1727204141.72371: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12755 1727204141.72446: no more pending results, returning what we have
12755 1727204141.72450: results queue empty
12755 1727204141.72452: checking for any_errors_fatal
12755 1727204141.72460: done checking for any_errors_fatal
12755 1727204141.72461: checking for max_fail_percentage
12755 1727204141.72463: done checking for max_fail_percentage
12755 1727204141.72464: checking to see if all hosts have failed and the running result is not ok
12755 1727204141.72465: done checking to see if all hosts have failed
12755 1727204141.72466: getting the remaining hosts for this loop
12755 1727204141.72468: done getting the remaining hosts for this loop
12755 1727204141.72472: getting the next task for host managed-node1
12755 1727204141.72694: done getting next task for host managed-node1
12755 1727204141.72700: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12755 1727204141.72704: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12755 1727204141.72730: getting variables
12755 1727204141.72732: in VariableManager get_vars()
12755 1727204141.72858: Calling all_inventory to load vars for managed-node1
12755 1727204141.72862: Calling groups_inventory to load vars for managed-node1
12755 1727204141.72865: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204141.72878: Calling all_plugins_play to load vars for managed-node1
12755 1727204141.72900: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204141.72915: Calling groups_plugins_play to load vars for managed-node1
12755 1727204141.76736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204141.79774: done with get_vars()
12755 1727204141.79810: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12755 1727204141.79908: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.200) 0:01:07.035 *****
12755 1727204141.79954: entering _queue_task() for managed-node1/yum
12755 1727204141.80268: worker is 1 (out of 1 available)
12755 1727204141.80283: exiting _queue_task() for managed-node1/yum
12755 1727204141.80300: done queuing things up, now waiting for results queue to drain
12755 1727204141.80302: waiting for pending results...
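The DNF check above and the YUM check queued next are the same probe gated onto complementary version ranges: `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` selects the dnf path, `ansible_distribution_major_version | int < 8` the yum path, and the dnf task additionally requires `__network_wireless_connections_defined or __network_team_connections_defined`. A hypothetical sketch of that split; the `__network_packages` variable name and module arguments are assumptions, not taken from the role's source:

```yaml
# Hypothetical sketch of the dnf/yum split visible in the log.
# __network_packages and the module arguments are assumptions.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    name: "{{ __network_packages }}"   # assumed variable name
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  yum:
    name: "{{ __network_packages }}"   # assumed variable name
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
```

Note the `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line in the log: on this host the yum task resolves to the dnf action plugin anyway, so the version gate, not the module name, is what distinguishes the two tasks.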
12755 1727204141.80813: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12755 1727204141.80819: in run() - task 12b410aa-8751-72e9-1a19-000000000172
12755 1727204141.80823: variable 'ansible_search_path' from source: unknown
12755 1727204141.80825: variable 'ansible_search_path' from source: unknown
12755 1727204141.80829: calling self._execute()
12755 1727204141.80972: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204141.80976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204141.81011: variable 'omit' from source: magic vars
12755 1727204141.81534: variable 'ansible_distribution_major_version' from source: facts
12755 1727204141.81558: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204141.81827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204141.83924: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204141.83980: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204141.84016: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204141.84046: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204141.84071: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204141.84143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204141.84179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204141.84207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.84244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204141.84258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204141.84356: variable 'ansible_distribution_major_version' from source: facts
12755 1727204141.84372: Evaluated conditional (ansible_distribution_major_version | int < 8): False
12755 1727204141.84375: when evaluation is False, skipping this task
12755 1727204141.84384: _execute() done
12755 1727204141.84391: dumping result to json
12755 1727204141.84395: done dumping result, returning
12755 1727204141.84428: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000172]
12755 1727204141.84434: sending task result for task 12b410aa-8751-72e9-1a19-000000000172
12755 1727204141.84652: done sending task result for task 12b410aa-8751-72e9-1a19-000000000172
12755 1727204141.84656: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
12755 1727204141.84854: no more pending results, returning what we have
12755 1727204141.84858: results queue empty
12755 1727204141.84859: checking for any_errors_fatal
12755 1727204141.84866: done checking for any_errors_fatal
12755 1727204141.84867: checking for max_fail_percentage
12755 1727204141.84869: done checking for max_fail_percentage
12755 1727204141.84870: checking to see if all hosts have failed and the running result is not ok
12755 1727204141.84871: done checking to see if all hosts have failed
12755 1727204141.84872: getting the remaining hosts for this loop
12755 1727204141.84873: done getting the remaining hosts for this loop
12755 1727204141.84877: getting the next task for host managed-node1
12755 1727204141.84885: done getting next task for host managed-node1
12755 1727204141.84892: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
12755 1727204141.84896: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12755 1727204141.84921: getting variables
12755 1727204141.84923: in VariableManager get_vars()
12755 1727204141.84978: Calling all_inventory to load vars for managed-node1
12755 1727204141.84981: Calling groups_inventory to load vars for managed-node1
12755 1727204141.84985: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204141.85008: Calling all_plugins_play to load vars for managed-node1
12755 1727204141.85013: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204141.85017: Calling groups_plugins_play to load vars for managed-node1
12755 1727204141.87232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204141.90929: done with get_vars()
12755 1727204141.90967: done getting variables
12755 1727204141.91140: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 14:55:41 -0400 (0:00:00.112) 0:01:07.147 *****
12755 1727204141.91185: entering _queue_task() for managed-node1/fail
12755 1727204141.91775: worker is 1 (out of 1 available)
12755 1727204141.92091: exiting _queue_task() for managed-node1/fail
12755 1727204141.92107: done queuing things up, now waiting for results queue to drain
12755 1727204141.92108: waiting for pending results...
12755 1727204141.92270: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
12755 1727204141.92370: in run() - task 12b410aa-8751-72e9-1a19-000000000173
12755 1727204141.92385: variable 'ansible_search_path' from source: unknown
12755 1727204141.92389: variable 'ansible_search_path' from source: unknown
12755 1727204141.92425: calling self._execute()
12755 1727204141.92519: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204141.92527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204141.92537: variable 'omit' from source: magic vars
12755 1727204141.92879: variable 'ansible_distribution_major_version' from source: facts
12755 1727204141.92892: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204141.92998: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204141.93179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204141.95797: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204141.95801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204141.95804: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204141.95827: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204141.95860: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204141.95972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204141.96028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204141.96059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.96123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204141.96140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204141.96212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204141.96239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204141.96268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.96594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204141.96598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204141.96601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204141.96603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204141.96606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.96608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204141.96610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204141.96773: variable 'network_connections' from source: task vars
12755 1727204141.96794: variable 'controller_profile' from source: play vars
12755 1727204141.96883: variable 'controller_profile' from source: play vars
12755 1727204141.96993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204141.97223: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204141.97280: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204141.97323: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204141.97357: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204141.97428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204141.97455: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204141.97510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204141.97545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204141.97627: variable '__network_team_connections_defined' from source: role '' defaults
12755 1727204141.98000: variable 'network_connections' from source: task vars
12755 1727204141.98008: variable 'controller_profile' from source: play vars
12755 1727204141.98103: variable 'controller_profile' from source: play vars
12755 1727204141.98136: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12755 1727204141.98140: when evaluation is False, skipping this task
12755 1727204141.98148: _execute() done
12755 1727204141.98153: dumping result to json
12755 1727204141.98165: done dumping result, returning
12755 1727204141.98176: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000173]
12755 1727204141.98182: sending task result for task 12b410aa-8751-72e9-1a19-000000000173
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12755 1727204141.98379: no more pending results, returning what we have
12755 1727204141.98384: results queue empty
12755 1727204141.98385: checking for any_errors_fatal
12755 1727204141.98396: done checking for any_errors_fatal
12755 1727204141.98397: checking for max_fail_percentage
12755 1727204141.98400: done checking for max_fail_percentage
12755 1727204141.98401: checking to see if all hosts have failed and the running result is not ok
12755 1727204141.98402: done checking to see if all hosts have failed
12755 1727204141.98403: getting the remaining hosts for this loop
12755 1727204141.98405: done getting the remaining hosts for this loop
12755 1727204141.98414: getting the next task for host managed-node1
12755 1727204141.98425: done getting next task for host managed-node1
12755 1727204141.98430: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
12755 1727204141.98435: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12755 1727204141.98601: done sending task result for task 12b410aa-8751-72e9-1a19-000000000173
12755 1727204141.98606: WORKER PROCESS EXITING
12755 1727204141.98630: getting variables
12755 1727204141.98632: in VariableManager get_vars()
12755 1727204141.98698: Calling all_inventory to load vars for managed-node1
12755 1727204141.98819: Calling groups_inventory to load vars for managed-node1
12755 1727204141.98823: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204141.98835: Calling all_plugins_play to load vars for managed-node1
12755 1727204141.98839: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204141.98843: Calling groups_plugins_play to load vars for managed-node1
12755 1727204142.02175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204142.04616: done with get_vars()
12755 1727204142.04667: done getting variables
12755 1727204142.04852: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.137) 0:01:07.284 *****
12755 1727204142.04904: entering _queue_task() for managed-node1/package
12755 1727204142.05770: worker is 1 (out of 1 available)
12755 1727204142.05784: exiting _queue_task() for managed-node1/package
12755 1727204142.05803: done queuing things up, now waiting for results queue to drain
12755 1727204142.05804: waiting for pending results...
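The skip above comes from the consent task's `when:` clause, which the log shows evaluating `__network_wireless_connections_defined or __network_team_connections_defined` to False. As a rough illustration only (the real definition lives at roles/network/tasks/main.yml:60 in fedora.linux_system_roles.network; the message text and the `network_allow_restart` variable here are assumptions, not the role's actual source), a task with this shape would produce exactly this kind of conditional skip:

```yaml
# Hypothetical sketch of the skipped consent task; names other than the
# two "__network_*_connections_defined" conditionals are illustrative.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: >-
      Managing wireless or team interfaces requires restarting
      NetworkManager; set a consent variable (assumed here to be
      network_allow_restart) to proceed.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
```

Because neither wireless nor team connections are defined in this run, Ansible never executes the `fail` action and reports `skip_reason: Conditional result was False`.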
12755 1727204142.06022: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages
12755 1727204142.06198: in run() - task 12b410aa-8751-72e9-1a19-000000000174
12755 1727204142.06217: variable 'ansible_search_path' from source: unknown
12755 1727204142.06221: variable 'ansible_search_path' from source: unknown
12755 1727204142.06296: calling self._execute()
12755 1727204142.06395: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204142.06404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204142.06448: variable 'omit' from source: magic vars
12755 1727204142.06903: variable 'ansible_distribution_major_version' from source: facts
12755 1727204142.06919: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204142.07221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12755 1727204142.07533: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12755 1727204142.07591: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12755 1727204142.07699: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12755 1727204142.07776: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12755 1727204142.08197: variable 'network_packages' from source: role '' defaults
12755 1727204142.08385: variable '__network_provider_setup' from source: role '' defaults
12755 1727204142.08388: variable '__network_service_name_default_nm' from source: role '' defaults
12755 1727204142.08448: variable '__network_service_name_default_nm' from source: role '' defaults
12755 1727204142.08458: variable '__network_packages_default_nm' from source: role '' defaults
12755 1727204142.08541: variable '__network_packages_default_nm' from source: role '' defaults
12755 1727204142.08820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12755 1727204142.12994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12755 1727204142.13001: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12755 1727204142.13123: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12755 1727204142.13127: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12755 1727204142.13129: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12755 1727204142.13257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204142.13293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204142.13340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204142.13384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204142.13403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204142.13467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204142.13498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204142.13538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204142.13590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204142.13607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204142.13996: variable '__network_packages_default_gobject_packages' from source: role '' defaults
12755 1727204142.14107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204142.14161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204142.14207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204142.14267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204142.14608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204142.14741: variable 'ansible_python' from source: facts
12755 1727204142.14795: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
12755 1727204142.15127: variable '__network_wpa_supplicant_required' from source: role '' defaults
12755 1727204142.15130: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
12755 1727204142.15581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204142.15616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204142.15728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204142.15894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204142.15920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204142.16097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12755 1727204142.16129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12755 1727204142.16158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204142.16245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12755 1727204142.16264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12755 1727204142.16671: variable 'network_connections' from source: task vars
12755 1727204142.16678: variable 'controller_profile' from source: play vars
12755 1727204142.17025: variable 'controller_profile' from source: play vars
12755 1727204142.17115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12755 1727204142.17155: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12755 1727204142.17390: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12755 1727204142.17470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12755 1727204142.17649: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204142.18573: variable 'network_connections' from source: task vars
12755 1727204142.18581: variable 'controller_profile' from source: play vars
12755 1727204142.18819: variable 'controller_profile' from source: play vars
12755 1727204142.18855: variable '__network_packages_default_wireless' from source: role '' defaults
12755 1727204142.19138: variable '__network_wireless_connections_defined' from source: role '' defaults
12755 1727204142.20200: variable 'network_connections' from source: task vars
12755 1727204142.20203: variable 'controller_profile' from source: play vars
12755 1727204142.20206: variable 'controller_profile' from source: play vars
12755 1727204142.20209: variable '__network_packages_default_team' from source: role '' defaults
12755 1727204142.20396: variable '__network_team_connections_defined' from source: role '' defaults
12755 1727204142.21309: variable 'network_connections' from source: task vars
12755 1727204142.21318: variable 'controller_profile' from source: play vars
12755 1727204142.21401: variable 'controller_profile' from source: play vars
12755 1727204142.21608: variable '__network_service_name_default_initscripts' from source: role '' defaults
12755 1727204142.21995: variable '__network_service_name_default_initscripts' from source: role '' defaults
12755 1727204142.21999: variable '__network_packages_default_initscripts' from source: role '' defaults
12755 1727204142.22157: variable '__network_packages_default_initscripts' from source: role '' defaults
12755 1727204142.22774: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
12755 1727204142.24238: variable 'network_connections' from source: task vars
12755 1727204142.24245: variable 'controller_profile' from source: play vars
12755 1727204142.24456: variable 'controller_profile' from source: play vars
12755 1727204142.24460: variable 'ansible_distribution' from source: facts
12755 1727204142.24462: variable '__network_rh_distros' from source: role '' defaults
12755 1727204142.24507: variable 'ansible_distribution_major_version' from source: facts
12755 1727204142.24564: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
12755 1727204142.24999: variable 'ansible_distribution' from source: facts
12755 1727204142.25003: variable '__network_rh_distros' from source: role '' defaults
12755 1727204142.25005: variable 'ansible_distribution_major_version' from source: facts
12755 1727204142.25008: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
12755 1727204142.25617: variable 'ansible_distribution' from source: facts
12755 1727204142.25621: variable '__network_rh_distros' from source: role '' defaults
12755 1727204142.25623: variable 'ansible_distribution_major_version' from source: facts
12755 1727204142.25635: variable 'network_provider' from source: set_fact
12755 1727204142.25655: variable 'ansible_facts' from source: unknown
12755 1727204142.28143: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
12755 1727204142.28148: when evaluation is False, skipping this task
12755 1727204142.28151: _execute() done
12755 1727204142.28153: dumping result to json
12755 1727204142.28209: done dumping result, returning
12755 1727204142.28213: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-72e9-1a19-000000000174]
12755 1727204142.28218: sending task result for task 12b410aa-8751-72e9-1a19-000000000174
12755 1727204142.28508: done sending task result for task 12b410aa-8751-72e9-1a19-000000000174
12755 1727204142.28511: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
12755 1727204142.28581: no more pending results, returning what we have
12755 1727204142.28585: results queue empty
12755 1727204142.28586: checking for any_errors_fatal
12755 1727204142.28598: done checking for any_errors_fatal
12755 1727204142.28599: checking for max_fail_percentage
12755 1727204142.28601: done checking for max_fail_percentage
12755 1727204142.28602: checking to see if all hosts have failed and the running result is not ok
12755 1727204142.28603: done checking to see if all hosts have failed
12755 1727204142.28604: getting the remaining hosts for this loop
12755 1727204142.28605: done getting the remaining hosts for this loop
12755 1727204142.28611: getting the next task for host managed-node1
12755 1727204142.28620: done getting next task for host managed-node1
12755 1727204142.28625: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12755 1727204142.28629: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12755 1727204142.28658: getting variables
12755 1727204142.28661: in VariableManager get_vars()
12755 1727204142.29125: Calling all_inventory to load vars for managed-node1
12755 1727204142.29129: Calling groups_inventory to load vars for managed-node1
12755 1727204142.29131: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204142.29148: Calling all_plugins_play to load vars for managed-node1
12755 1727204142.29151: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204142.29156: Calling groups_plugins_play to load vars for managed-node1
12755 1727204142.37496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204142.44009: done with get_vars()
12755 1727204142.44061: done getting variables
12755 1727204142.44248: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.394) 0:01:07.679 *****
12755 1727204142.44406: entering _queue_task() for managed-node1/package
12755 1727204142.45287: worker is 1 (out of 1 available)
12755 1727204142.45303: exiting _queue_task() for managed-node1/package
12755 1727204142.45316: done queuing things up, now waiting for results queue to drain
12755 1727204142.45318: waiting for pending results...
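The "Install packages" skip above is driven by the Jinja2 `subset` test the log records: `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False, meaning every package the role wants is already present in the package facts, so there is nothing to install. As a hedged sketch only (the real task sits at roles/network/tasks/main.yml:73; the exact module arguments are assumptions inferred from the log, not the role's verbatim source), the pattern looks like:

```yaml
# Hypothetical sketch of an idempotent install guarded by package facts.
# Requires that ansible_facts.packages was populated earlier, e.g. by the
# package_facts module; only the when-condition is taken verbatim from the log.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  when:
    - not network_packages is subset(ansible_facts.packages.keys())
```

Guarding with `subset` against `ansible_facts.packages` avoids invoking the package manager at all when the package set is already satisfied, which is why this run reports a skip rather than an `ok`.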
12755 1727204142.46015: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12755 1727204142.46168: in run() - task 12b410aa-8751-72e9-1a19-000000000175
12755 1727204142.46236: variable 'ansible_search_path' from source: unknown
12755 1727204142.46435: variable 'ansible_search_path' from source: unknown
12755 1727204142.46439: calling self._execute()
12755 1727204142.46670: variable 'ansible_host' from source: host vars for 'managed-node1'
12755 1727204142.46684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12755 1727204142.46706: variable 'omit' from source: magic vars
12755 1727204142.47608: variable 'ansible_distribution_major_version' from source: facts
12755 1727204142.47734: Evaluated conditional (ansible_distribution_major_version != '6'): True
12755 1727204142.47797: variable 'network_state' from source: role '' defaults
12755 1727204142.47810: Evaluated conditional (network_state != {}): False
12755 1727204142.47816: when evaluation is False, skipping this task
12755 1727204142.47820: _execute() done
12755 1727204142.47826: dumping result to json
12755 1727204142.47829: done dumping result, returning
12755 1727204142.47843: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-72e9-1a19-000000000175]
12755 1727204142.47850: sending task result for task 12b410aa-8751-72e9-1a19-000000000175
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12755 1727204142.48072: no more pending results, returning what we have
12755 1727204142.48075: results queue empty
12755 1727204142.48077: checking for any_errors_fatal
12755 1727204142.48083: done checking for any_errors_fatal
12755 1727204142.48084: checking for max_fail_percentage
12755 1727204142.48087: done checking for max_fail_percentage
12755 1727204142.48087: checking to see if all hosts have failed and the running result is not ok
12755 1727204142.48090: done checking to see if all hosts have failed
12755 1727204142.48091: getting the remaining hosts for this loop
12755 1727204142.48093: done getting the remaining hosts for this loop
12755 1727204142.48097: getting the next task for host managed-node1
12755 1727204142.48107: done getting next task for host managed-node1
12755 1727204142.48112: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
12755 1727204142.48116: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12755 1727204142.48139: getting variables
12755 1727204142.48141: in VariableManager get_vars()
12755 1727204142.48400: Calling all_inventory to load vars for managed-node1
12755 1727204142.48404: Calling groups_inventory to load vars for managed-node1
12755 1727204142.48407: Calling all_plugins_inventory to load vars for managed-node1
12755 1727204142.48419: Calling all_plugins_play to load vars for managed-node1
12755 1727204142.48422: Calling groups_plugins_inventory to load vars for managed-node1
12755 1727204142.48426: Calling groups_plugins_play to load vars for managed-node1
12755 1727204142.49107: done sending task result for task 12b410aa-8751-72e9-1a19-000000000175
12755 1727204142.49111: WORKER PROCESS EXITING
12755 1727204142.53124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12755 1727204142.58836: done with get_vars()
12755 1727204142.58879: done getting variables
12755 1727204142.59039: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.146) 0:01:07.826 *****
12755 1727204142.59086: entering _queue_task() for managed-node1/package
12755 1727204142.59686: worker is 1 (out of 1 available)
12755 1727204142.59702: exiting _queue_task() for managed-node1/package
12755 1727204142.59721: done queuing things up, now waiting for results queue to drain
12755 1727204142.59723: waiting for pending results...
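Both nmstate-related tasks in this stretch of the log skip on the same condition, `network_state != {}`: this play configures connections via `network_connections`, so the role's `network_state` variable keeps its empty default and the nmstate tooling is never installed. A hedged sketch of what such a conditional install looks like (the real task is at roles/network/tasks/main.yml:85; the package list is an assumption taken from the task name, not the role's verbatim source):

```yaml
# Hypothetical sketch: install the nmstate stack only when the caller
# actually passed a declarative network_state; otherwise skip entirely.
# Only the when-condition is taken verbatim from the log.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - network_state != {}
```

With `network_state` left at its empty-dict default, the conditional is False and Ansible reports the same `skip_reason: Conditional result was False` seen above.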
12755 1727204142.60067: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12755 1727204142.60475: in run() - task 12b410aa-8751-72e9-1a19-000000000176 12755 1727204142.60895: variable 'ansible_search_path' from source: unknown 12755 1727204142.60899: variable 'ansible_search_path' from source: unknown 12755 1727204142.60901: calling self._execute() 12755 1727204142.60905: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204142.60959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204142.60993: variable 'omit' from source: magic vars 12755 1727204142.61515: variable 'ansible_distribution_major_version' from source: facts 12755 1727204142.61538: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204142.61700: variable 'network_state' from source: role '' defaults 12755 1727204142.61721: Evaluated conditional (network_state != {}): False 12755 1727204142.61734: when evaluation is False, skipping this task 12755 1727204142.61743: _execute() done 12755 1727204142.61751: dumping result to json 12755 1727204142.61759: done dumping result, returning 12755 1727204142.61771: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-72e9-1a19-000000000176] 12755 1727204142.61781: sending task result for task 12b410aa-8751-72e9-1a19-000000000176 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204142.61970: no more pending results, returning what we have 12755 1727204142.61974: results queue empty 12755 1727204142.61976: checking for any_errors_fatal 12755 1727204142.61983: done checking for any_errors_fatal 12755 1727204142.61984: checking for max_fail_percentage 12755 
1727204142.61992: done checking for max_fail_percentage 12755 1727204142.61993: checking to see if all hosts have failed and the running result is not ok 12755 1727204142.61994: done checking to see if all hosts have failed 12755 1727204142.61995: getting the remaining hosts for this loop 12755 1727204142.61997: done getting the remaining hosts for this loop 12755 1727204142.62003: getting the next task for host managed-node1 12755 1727204142.62016: done getting next task for host managed-node1 12755 1727204142.62020: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12755 1727204142.62026: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204142.62065: getting variables 12755 1727204142.62068: in VariableManager get_vars() 12755 1727204142.62309: Calling all_inventory to load vars for managed-node1 12755 1727204142.62313: Calling groups_inventory to load vars for managed-node1 12755 1727204142.62316: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204142.62328: Calling all_plugins_play to load vars for managed-node1 12755 1727204142.62332: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204142.62336: Calling groups_plugins_play to load vars for managed-node1 12755 1727204142.63026: done sending task result for task 12b410aa-8751-72e9-1a19-000000000176 12755 1727204142.63029: WORKER PROCESS EXITING 12755 1727204142.65228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204142.68340: done with get_vars() 12755 1727204142.68378: done getting variables 12755 1727204142.68448: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.094) 0:01:07.920 ***** 12755 1727204142.68494: entering _queue_task() for managed-node1/service 12755 1727204142.68827: worker is 1 (out of 1 available) 12755 1727204142.68841: exiting _queue_task() for managed-node1/service 12755 1727204142.68855: done queuing things up, now waiting for results queue to drain 12755 1727204142.68857: waiting for pending results... 
12755 1727204142.69167: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12755 1727204142.69362: in run() - task 12b410aa-8751-72e9-1a19-000000000177 12755 1727204142.69384: variable 'ansible_search_path' from source: unknown 12755 1727204142.69395: variable 'ansible_search_path' from source: unknown 12755 1727204142.69443: calling self._execute() 12755 1727204142.69562: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204142.69576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204142.69595: variable 'omit' from source: magic vars 12755 1727204142.70043: variable 'ansible_distribution_major_version' from source: facts 12755 1727204142.70063: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204142.70227: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204142.70630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204142.74040: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204142.74134: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204142.74193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204142.74240: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204142.74280: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204142.74379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12755 1727204142.74428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204142.74466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.74532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204142.74594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204142.74626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204142.74660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204142.74696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.74756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204142.74778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204142.74839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204142.74994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204142.74998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.75000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204142.75003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204142.75200: variable 'network_connections' from source: task vars 12755 1727204142.75222: variable 'controller_profile' from source: play vars 12755 1727204142.75310: variable 'controller_profile' from source: play vars 12755 1727204142.75409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204142.75662: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204142.75693: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204142.75736: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 
1727204142.75778: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204142.75880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204142.75883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204142.75908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.75947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204142.76014: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204142.76351: variable 'network_connections' from source: task vars 12755 1727204142.76363: variable 'controller_profile' from source: play vars 12755 1727204142.76446: variable 'controller_profile' from source: play vars 12755 1727204142.76532: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12755 1727204142.76536: when evaluation is False, skipping this task 12755 1727204142.76538: _execute() done 12755 1727204142.76540: dumping result to json 12755 1727204142.76543: done dumping result, returning 12755 1727204142.76545: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-72e9-1a19-000000000177] 12755 1727204142.76547: sending task result for task 12b410aa-8751-72e9-1a19-000000000177 12755 1727204142.76806: done sending task 
result for task 12b410aa-8751-72e9-1a19-000000000177 12755 1727204142.76817: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12755 1727204142.76872: no more pending results, returning what we have 12755 1727204142.76879: results queue empty 12755 1727204142.76880: checking for any_errors_fatal 12755 1727204142.76888: done checking for any_errors_fatal 12755 1727204142.76891: checking for max_fail_percentage 12755 1727204142.76894: done checking for max_fail_percentage 12755 1727204142.76894: checking to see if all hosts have failed and the running result is not ok 12755 1727204142.76896: done checking to see if all hosts have failed 12755 1727204142.76897: getting the remaining hosts for this loop 12755 1727204142.76898: done getting the remaining hosts for this loop 12755 1727204142.76904: getting the next task for host managed-node1 12755 1727204142.76914: done getting next task for host managed-node1 12755 1727204142.76921: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12755 1727204142.76926: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204142.76954: getting variables 12755 1727204142.76956: in VariableManager get_vars() 12755 1727204142.77221: Calling all_inventory to load vars for managed-node1 12755 1727204142.77225: Calling groups_inventory to load vars for managed-node1 12755 1727204142.77228: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204142.77241: Calling all_plugins_play to load vars for managed-node1 12755 1727204142.77247: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204142.77252: Calling groups_plugins_play to load vars for managed-node1 12755 1727204142.79580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204142.82521: done with get_vars() 12755 1727204142.82559: done getting variables 12755 1727204142.82631: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:55:42 -0400 (0:00:00.141) 0:01:08.062 ***** 12755 1727204142.82670: entering _queue_task() for managed-node1/service 12755 1727204142.83048: worker is 1 (out of 1 available) 12755 1727204142.83062: exiting _queue_task() for managed-node1/service 12755 1727204142.83077: done queuing things up, now waiting for results queue to drain 12755 1727204142.83079: waiting for pending results... 
12755 1727204142.83401: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12755 1727204142.83596: in run() - task 12b410aa-8751-72e9-1a19-000000000178 12755 1727204142.83620: variable 'ansible_search_path' from source: unknown 12755 1727204142.83697: variable 'ansible_search_path' from source: unknown 12755 1727204142.83700: calling self._execute() 12755 1727204142.83803: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204142.83822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204142.83839: variable 'omit' from source: magic vars 12755 1727204142.84309: variable 'ansible_distribution_major_version' from source: facts 12755 1727204142.84331: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204142.84571: variable 'network_provider' from source: set_fact 12755 1727204142.84685: variable 'network_state' from source: role '' defaults 12755 1727204142.84689: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12755 1727204142.84694: variable 'omit' from source: magic vars 12755 1727204142.84705: variable 'omit' from source: magic vars 12755 1727204142.84746: variable 'network_service_name' from source: role '' defaults 12755 1727204142.84849: variable 'network_service_name' from source: role '' defaults 12755 1727204142.84994: variable '__network_provider_setup' from source: role '' defaults 12755 1727204142.85001: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204142.85087: variable '__network_service_name_default_nm' from source: role '' defaults 12755 1727204142.85099: variable '__network_packages_default_nm' from source: role '' defaults 12755 1727204142.85195: variable '__network_packages_default_nm' from source: role '' defaults 12755 1727204142.85617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 12755 1727204142.90320: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204142.90407: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204142.90508: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204142.90521: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204142.90572: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204142.90678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204142.90726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204142.90795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.90827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204142.90844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204142.90917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12755 1727204142.90945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204142.90975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.91035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204142.91055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204142.91405: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12755 1727204142.91582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204142.91617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204142.91656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.91706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204142.91725: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204142.91839: variable 'ansible_python' from source: facts 12755 1727204142.91876: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12755 1727204142.91984: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204142.92080: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12755 1727204142.92251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204142.92278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204142.92321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.92372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204142.92391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204142.92462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204142.92491: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204142.92533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.92583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204142.92600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204142.92817: variable 'network_connections' from source: task vars 12755 1727204142.92820: variable 'controller_profile' from source: play vars 12755 1727204142.92894: variable 'controller_profile' from source: play vars 12755 1727204142.93243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204142.94104: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204142.94166: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204142.94347: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204142.94460: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204142.94543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204142.94579: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204142.94624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204142.94668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204142.94726: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204142.95133: variable 'network_connections' from source: task vars 12755 1727204142.95243: variable 'controller_profile' from source: play vars 12755 1727204142.95247: variable 'controller_profile' from source: play vars 12755 1727204142.95279: variable '__network_packages_default_wireless' from source: role '' defaults 12755 1727204142.95382: variable '__network_wireless_connections_defined' from source: role '' defaults 12755 1727204142.95783: variable 'network_connections' from source: task vars 12755 1727204142.95787: variable 'controller_profile' from source: play vars 12755 1727204142.95876: variable 'controller_profile' from source: play vars 12755 1727204142.95906: variable '__network_packages_default_team' from source: role '' defaults 12755 1727204142.96016: variable '__network_team_connections_defined' from source: role '' defaults 12755 1727204142.96588: variable 'network_connections' from source: task vars 12755 1727204142.96660: variable 'controller_profile' from source: play vars 12755 1727204142.96833: variable 'controller_profile' from source: play vars 12755 1727204142.96894: variable '__network_service_name_default_initscripts' from source: role '' defaults 12755 1727204142.97084: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 12755 1727204142.97092: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204142.97229: variable '__network_packages_default_initscripts' from source: role '' defaults 12755 1727204142.97857: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12755 1727204142.99358: variable 'network_connections' from source: task vars 12755 1727204142.99430: variable 'controller_profile' from source: play vars 12755 1727204142.99590: variable 'controller_profile' from source: play vars 12755 1727204142.99595: variable 'ansible_distribution' from source: facts 12755 1727204142.99598: variable '__network_rh_distros' from source: role '' defaults 12755 1727204142.99713: variable 'ansible_distribution_major_version' from source: facts 12755 1727204142.99896: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12755 1727204143.00054: variable 'ansible_distribution' from source: facts 12755 1727204143.00058: variable '__network_rh_distros' from source: role '' defaults 12755 1727204143.00187: variable 'ansible_distribution_major_version' from source: facts 12755 1727204143.00223: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12755 1727204143.00804: variable 'ansible_distribution' from source: facts 12755 1727204143.00808: variable '__network_rh_distros' from source: role '' defaults 12755 1727204143.00810: variable 'ansible_distribution_major_version' from source: facts 12755 1727204143.00845: variable 'network_provider' from source: set_fact 12755 1727204143.00949: variable 'omit' from source: magic vars 12755 1727204143.01077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204143.01118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204143.01142: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204143.01164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204143.01178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204143.01495: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204143.01498: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204143.01505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204143.01685: Set connection var ansible_connection to ssh 12755 1727204143.01786: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204143.01792: Set connection var ansible_shell_type to sh 12755 1727204143.01805: Set connection var ansible_timeout to 10 12755 1727204143.01856: Set connection var ansible_shell_executable to /bin/sh 12755 1727204143.01865: Set connection var ansible_pipelining to False 12755 1727204143.01936: variable 'ansible_shell_executable' from source: unknown 12755 1727204143.01940: variable 'ansible_connection' from source: unknown 12755 1727204143.01943: variable 'ansible_module_compression' from source: unknown 12755 1727204143.01946: variable 'ansible_shell_type' from source: unknown 12755 1727204143.01948: variable 'ansible_shell_executable' from source: unknown 12755 1727204143.01950: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204143.01953: variable 'ansible_pipelining' from source: unknown 12755 1727204143.02169: variable 'ansible_timeout' from source: unknown 12755 1727204143.02172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204143.02275: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204143.02283: variable 'omit' from source: magic vars 12755 1727204143.02286: starting attempt loop 12755 1727204143.02296: running the handler 12755 1727204143.02409: variable 'ansible_facts' from source: unknown 12755 1727204143.04024: _low_level_execute_command(): starting 12755 1727204143.04028: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204143.04915: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204143.04996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204143.05018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204143.05065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204143.05100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204143.05213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 
1727204143.07112: stdout chunk (state=3): >>>/root <<< 12755 1727204143.07435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204143.07438: stdout chunk (state=3): >>><<< 12755 1727204143.07441: stderr chunk (state=3): >>><<< 12755 1727204143.07444: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204143.07447: _low_level_execute_command(): starting 12755 1727204143.07450: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838 `" && echo ansible-tmp-1727204143.0734632-16603-266871923589838="` echo /root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838 `" ) && sleep 0' 12755 1727204143.08576: stderr chunk (state=2): >>>OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204143.08609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204143.08796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204143.11014: stdout chunk (state=3): >>>ansible-tmp-1727204143.0734632-16603-266871923589838=/root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838 <<< 12755 1727204143.11106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204143.11129: stdout chunk (state=3): >>><<< 12755 1727204143.11132: stderr chunk (state=3): >>><<< 12755 1727204143.11216: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204143.0734632-16603-266871923589838=/root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204143.11220: variable 'ansible_module_compression' from source: unknown 12755 1727204143.11381: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12755 1727204143.11455: variable 'ansible_facts' from source: unknown 12755 1727204143.11720: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/AnsiballZ_systemd.py 12755 1727204143.11995: Sending initial data 12755 1727204143.12059: Sent initial data (156 bytes) 12755 1727204143.12598: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204143.12602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204143.12605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204143.12707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204143.12710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204143.12715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204143.13000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204143.14610: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12755 1727204143.14624: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 12755 1727204143.14634: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 12755 1727204143.14650: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 12755 1727204143.14663: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204143.14724: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12755 1727204143.14783: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpytezzahr /root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/AnsiballZ_systemd.py <<< 12755 1727204143.14787: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/AnsiballZ_systemd.py" <<< 12755 1727204143.14837: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpytezzahr" to remote "/root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/AnsiballZ_systemd.py" <<< 12755 1727204143.17606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204143.17613: stdout chunk (state=3): >>><<< 12755 1727204143.17615: stderr chunk (state=3): >>><<< 12755 1727204143.17618: done transferring module to remote 12755 1727204143.17620: _low_level_execute_command(): starting 12755 1727204143.17622: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/ /root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/AnsiballZ_systemd.py && sleep 0' 12755 1727204143.18320: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204143.18337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204143.18353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204143.18408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204143.18427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204143.18547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204143.18565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204143.18613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204143.21203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204143.21207: stdout chunk (state=3): >>><<< 12755 1727204143.21211: stderr chunk (state=3): >>><<< 12755 1727204143.21365: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204143.21369: _low_level_execute_command(): starting 12755 1727204143.21372: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/AnsiballZ_systemd.py && sleep 0' 12755 1727204143.22261: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204143.22265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204143.22268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204143.22271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204143.22273: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204143.22275: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204143.22277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204143.22281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204143.22284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204143.22286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204143.22308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204143.22396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204143.55908: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "12382208", "MemoryAvailable": "infinity", "CPUUsageNSec": "1319533000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": 
"18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "in<<< 12755 1727204143.55935: stdout chunk (state=3): >>>finity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", 
"SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", 
"RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", 
"AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12755 1727204143.58295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 12755 1727204143.58299: stdout chunk (state=3): >>><<< 12755 1727204143.58302: stderr chunk (state=3): >>><<< 12755 1727204143.58307: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", 
"NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "12382208", "MemoryAvailable": "infinity", "CPUUsageNSec": "1319533000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", 
"TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", 
"ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204143.58442: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204143.58464: _low_level_execute_command(): starting 12755 1727204143.58470: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204143.0734632-16603-266871923589838/ > /dev/null 2>&1 && sleep 0' 12755 1727204143.59202: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204143.59280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204143.59336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204143.59351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204143.59368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204143.59452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204143.61506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204143.61510: stdout chunk (state=3): >>><<< 12755 1727204143.61522: stderr chunk (state=3): >>><<< 12755 1727204143.61547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204143.61556: handler run complete 12755 1727204143.61653: attempt loop complete, returning result 12755 1727204143.61657: _execute() done 12755 1727204143.61660: dumping result to json 12755 1727204143.61684: done dumping result, returning 12755 1727204143.61698: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-72e9-1a19-000000000178] 12755 1727204143.61703: sending task result for task 12b410aa-8751-72e9-1a19-000000000178 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204143.62176: no more pending results, returning what we have 12755 1727204143.62180: results queue empty 12755 1727204143.62182: checking for any_errors_fatal 12755 1727204143.62191: done checking for any_errors_fatal 12755 1727204143.62393: checking for max_fail_percentage 12755 1727204143.62397: done checking for max_fail_percentage 12755 1727204143.62398: checking to see if all hosts have failed and the running result is not ok 12755 1727204143.62399: done checking to see if all hosts have failed 12755 1727204143.62400: getting the remaining hosts for this loop 12755 1727204143.62402: done getting the remaining hosts for this loop 12755 1727204143.62407: getting the next task for host managed-node1 12755 1727204143.62417: done getting next task for host managed-node1 12755 1727204143.62422: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12755 1727204143.62426: ^ state 
is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204143.62447: getting variables 12755 1727204143.62449: in VariableManager get_vars() 12755 1727204143.62515: Calling all_inventory to load vars for managed-node1 12755 1727204143.62519: Calling groups_inventory to load vars for managed-node1 12755 1727204143.62522: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204143.62535: Calling all_plugins_play to load vars for managed-node1 12755 1727204143.62539: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204143.62544: Calling groups_plugins_play to load vars for managed-node1 12755 1727204143.63066: done sending task result for task 12b410aa-8751-72e9-1a19-000000000178 12755 1727204143.63070: WORKER PROCESS EXITING 12755 1727204143.64994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204143.73539: done with get_vars() 12755 1727204143.73586: done getting variables 12755 1727204143.73674: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.910) 0:01:08.972 ***** 12755 1727204143.73716: entering _queue_task() for managed-node1/service 12755 1727204143.74159: worker is 1 (out of 1 available) 12755 1727204143.74180: exiting _queue_task() for managed-node1/service 12755 1727204143.74344: done queuing things up, now waiting for results queue to drain 12755 1727204143.74351: waiting for pending results... 12755 1727204143.74734: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12755 1727204143.75084: in run() - task 12b410aa-8751-72e9-1a19-000000000179 12755 1727204143.75092: variable 'ansible_search_path' from source: unknown 12755 1727204143.75097: variable 'ansible_search_path' from source: unknown 12755 1727204143.75154: calling self._execute() 12755 1727204143.75288: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204143.75311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204143.75330: variable 'omit' from source: magic vars 12755 1727204143.76108: variable 'ansible_distribution_major_version' from source: facts 12755 1727204143.76114: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204143.76192: variable 'network_provider' from source: set_fact 12755 1727204143.76219: Evaluated conditional (network_provider == "nm"): True 12755 1727204143.76376: variable '__network_wpa_supplicant_required' from source: role '' defaults 12755 1727204143.76522: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 12755 1727204143.76795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204143.79419: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204143.79521: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204143.79572: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204143.79629: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204143.79667: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204143.79775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204143.79822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204143.79864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204143.79927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204143.79954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204143.80022: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204143.80072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204143.80112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204143.80280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204143.80284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204143.80287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204143.80290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204143.80325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204143.80395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 
1727204143.80422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204143.80620: variable 'network_connections' from source: task vars 12755 1727204143.80641: variable 'controller_profile' from source: play vars 12755 1727204143.80749: variable 'controller_profile' from source: play vars 12755 1727204143.80854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12755 1727204143.81268: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12755 1727204143.81336: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12755 1727204143.81386: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12755 1727204143.81430: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12755 1727204143.81495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12755 1727204143.81532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12755 1727204143.81739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204143.81743: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12755 1727204143.81767: variable 
'__network_wireless_connections_defined' from source: role '' defaults 12755 1727204143.82178: variable 'network_connections' from source: task vars 12755 1727204143.82194: variable 'controller_profile' from source: play vars 12755 1727204143.82279: variable 'controller_profile' from source: play vars 12755 1727204143.82329: Evaluated conditional (__network_wpa_supplicant_required): False 12755 1727204143.82339: when evaluation is False, skipping this task 12755 1727204143.82349: _execute() done 12755 1727204143.82357: dumping result to json 12755 1727204143.82554: done dumping result, returning 12755 1727204143.82559: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-72e9-1a19-000000000179] 12755 1727204143.82571: sending task result for task 12b410aa-8751-72e9-1a19-000000000179 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12755 1727204143.82753: no more pending results, returning what we have 12755 1727204143.82758: results queue empty 12755 1727204143.82759: checking for any_errors_fatal 12755 1727204143.82793: done checking for any_errors_fatal 12755 1727204143.82795: checking for max_fail_percentage 12755 1727204143.82797: done checking for max_fail_percentage 12755 1727204143.82798: checking to see if all hosts have failed and the running result is not ok 12755 1727204143.82799: done checking to see if all hosts have failed 12755 1727204143.82800: getting the remaining hosts for this loop 12755 1727204143.82802: done getting the remaining hosts for this loop 12755 1727204143.82807: getting the next task for host managed-node1 12755 1727204143.82820: done getting next task for host managed-node1 12755 1727204143.82825: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12755 1727204143.82829: ^ state is: HOST STATE: block=2, task=34, rescue=0, 
always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204143.82859: getting variables 12755 1727204143.82861: in VariableManager get_vars() 12755 1727204143.83129: Calling all_inventory to load vars for managed-node1 12755 1727204143.83133: Calling groups_inventory to load vars for managed-node1 12755 1727204143.83137: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204143.83145: done sending task result for task 12b410aa-8751-72e9-1a19-000000000179 12755 1727204143.83149: WORKER PROCESS EXITING 12755 1727204143.83161: Calling all_plugins_play to load vars for managed-node1 12755 1727204143.83165: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204143.83169: Calling groups_plugins_play to load vars for managed-node1 12755 1727204143.86946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204143.91552: done with get_vars() 12755 1727204143.91612: done getting variables 12755 1727204143.91727: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:55:43 -0400 (0:00:00.180) 0:01:09.153 ***** 12755 1727204143.91769: entering _queue_task() for managed-node1/service 12755 1727204143.92600: worker is 1 (out of 1 available) 12755 1727204143.92618: exiting _queue_task() for managed-node1/service 12755 1727204143.92631: done queuing things up, now waiting for results queue to drain 12755 1727204143.92633: waiting for pending results... 12755 1727204143.93000: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 12755 1727204143.93434: in run() - task 12b410aa-8751-72e9-1a19-00000000017a 12755 1727204143.93462: variable 'ansible_search_path' from source: unknown 12755 1727204143.93472: variable 'ansible_search_path' from source: unknown 12755 1727204143.93521: calling self._execute() 12755 1727204143.93658: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204143.93680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204143.93704: variable 'omit' from source: magic vars 12755 1727204143.94317: variable 'ansible_distribution_major_version' from source: facts 12755 1727204143.94338: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204143.94795: variable 'network_provider' from source: set_fact 12755 1727204143.94799: Evaluated conditional (network_provider == "initscripts"): False 12755 1727204143.94802: when evaluation is False, skipping this task 12755 1727204143.94805: _execute() done 12755 1727204143.94808: dumping result to json 12755 1727204143.94815: done dumping result, 
returning 12755 1727204143.94818: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-72e9-1a19-00000000017a] 12755 1727204143.94821: sending task result for task 12b410aa-8751-72e9-1a19-00000000017a 12755 1727204143.95007: done sending task result for task 12b410aa-8751-72e9-1a19-00000000017a 12755 1727204143.95013: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12755 1727204143.95064: no more pending results, returning what we have 12755 1727204143.95069: results queue empty 12755 1727204143.95070: checking for any_errors_fatal 12755 1727204143.95078: done checking for any_errors_fatal 12755 1727204143.95079: checking for max_fail_percentage 12755 1727204143.95081: done checking for max_fail_percentage 12755 1727204143.95082: checking to see if all hosts have failed and the running result is not ok 12755 1727204143.95084: done checking to see if all hosts have failed 12755 1727204143.95085: getting the remaining hosts for this loop 12755 1727204143.95087: done getting the remaining hosts for this loop 12755 1727204143.95093: getting the next task for host managed-node1 12755 1727204143.95103: done getting next task for host managed-node1 12755 1727204143.95108: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12755 1727204143.95116: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204143.95147: getting variables 12755 1727204143.95150: in VariableManager get_vars() 12755 1727204143.95617: Calling all_inventory to load vars for managed-node1 12755 1727204143.95621: Calling groups_inventory to load vars for managed-node1 12755 1727204143.95625: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204143.95636: Calling all_plugins_play to load vars for managed-node1 12755 1727204143.95640: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204143.95644: Calling groups_plugins_play to load vars for managed-node1 12755 1727204144.00742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204144.07361: done with get_vars() 12755 1727204144.07531: done getting variables 12755 1727204144.07716: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.161) 0:01:09.314 ***** 12755 1727204144.07882: entering _queue_task() for managed-node1/copy 12755 1727204144.08653: worker is 1 (out of 1 available) 12755 
1727204144.08668: exiting _queue_task() for managed-node1/copy 12755 1727204144.08682: done queuing things up, now waiting for results queue to drain 12755 1727204144.08684: waiting for pending results... 12755 1727204144.09208: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12755 1727204144.09486: in run() - task 12b410aa-8751-72e9-1a19-00000000017b 12755 1727204144.09795: variable 'ansible_search_path' from source: unknown 12755 1727204144.09799: variable 'ansible_search_path' from source: unknown 12755 1727204144.09802: calling self._execute() 12755 1727204144.09897: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204144.09915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204144.10194: variable 'omit' from source: magic vars 12755 1727204144.10779: variable 'ansible_distribution_major_version' from source: facts 12755 1727204144.11194: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204144.11198: variable 'network_provider' from source: set_fact 12755 1727204144.11201: Evaluated conditional (network_provider == "initscripts"): False 12755 1727204144.11204: when evaluation is False, skipping this task 12755 1727204144.11207: _execute() done 12755 1727204144.11212: dumping result to json 12755 1727204144.11214: done dumping result, returning 12755 1727204144.11218: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-72e9-1a19-00000000017b] 12755 1727204144.11401: sending task result for task 12b410aa-8751-72e9-1a19-00000000017b skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12755 1727204144.11655: no more pending results, returning what we have 12755 
1727204144.11659: results queue empty 12755 1727204144.11660: checking for any_errors_fatal 12755 1727204144.11667: done checking for any_errors_fatal 12755 1727204144.11668: checking for max_fail_percentage 12755 1727204144.11670: done checking for max_fail_percentage 12755 1727204144.11671: checking to see if all hosts have failed and the running result is not ok 12755 1727204144.11672: done checking to see if all hosts have failed 12755 1727204144.11673: getting the remaining hosts for this loop 12755 1727204144.11675: done getting the remaining hosts for this loop 12755 1727204144.11680: getting the next task for host managed-node1 12755 1727204144.11691: done getting next task for host managed-node1 12755 1727204144.11695: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12755 1727204144.11700: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204144.11731: getting variables 12755 1727204144.11733: in VariableManager get_vars() 12755 1727204144.11921: Calling all_inventory to load vars for managed-node1 12755 1727204144.11925: Calling groups_inventory to load vars for managed-node1 12755 1727204144.11928: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204144.11945: Calling all_plugins_play to load vars for managed-node1 12755 1727204144.11949: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204144.11954: Calling groups_plugins_play to load vars for managed-node1 12755 1727204144.12539: done sending task result for task 12b410aa-8751-72e9-1a19-00000000017b 12755 1727204144.12544: WORKER PROCESS EXITING 12755 1727204144.16752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204144.23671: done with get_vars() 12755 1727204144.23713: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024 14:55:44 -0400 (0:00:00.162) 0:01:09.476 *****

12755 1727204144.24104: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 12755 1727204144.24881: worker is 1 (out of 1 available) 12755 1727204144.24898: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 12755 1727204144.24914: done queuing things up, now waiting for results queue to drain 12755 1727204144.24916: waiting for pending results...
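(Every entry in this -vvvv trace carries a `<pid> <epoch-seconds>:` prefix, which is what makes the flattened stream mechanically recoverable and lets you measure elapsed time between any two events, such as `entering _queue_task()` and `waiting for pending results...` above. A minimal sketch of that idea — the helper name and regex are ours, not part of any Ansible API:)

```python
import re

# A verbose-mode entry looks like: "12755 1727204144.24104: entering _queue_task() ..."
# where 12755 is the controller PID and the second field is a Unix timestamp.
ENTRY = re.compile(r"(\d+) (\d+\.\d+): ")

def split_entries(raw: str):
    """Split a flattened -vvvv stream into (pid, timestamp, message) tuples."""
    parts = ENTRY.split(raw)
    # re.split with capturing groups yields: [leading, pid, ts, msg, pid, ts, msg, ...]
    out = []
    for i in range(1, len(parts) - 2, 3):
        out.append((int(parts[i]), float(parts[i + 1]), parts[i + 2].strip()))
    return out

raw = ("12755 1727204144.24104: entering _queue_task() for managed-node1 "
      "12755 1727204144.24916: waiting for pending results...")
entries = split_entries(raw)
elapsed = entries[-1][1] - entries[0][1]  # seconds between first and last entry
```

(This is only an illustration of the log format; ansible itself emits one entry per line, and the wrapping seen here is an artifact of how the log was captured.)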
12755 1727204144.25318: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12755 1727204144.25780: in run() - task 12b410aa-8751-72e9-1a19-00000000017c 12755 1727204144.25804: variable 'ansible_search_path' from source: unknown 12755 1727204144.25816: variable 'ansible_search_path' from source: unknown 12755 1727204144.26038: calling self._execute() 12755 1727204144.26164: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204144.26179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204144.26494: variable 'omit' from source: magic vars 12755 1727204144.27073: variable 'ansible_distribution_major_version' from source: facts 12755 1727204144.27494: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204144.27498: variable 'omit' from source: magic vars 12755 1727204144.27500: variable 'omit' from source: magic vars 12755 1727204144.27820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12755 1727204144.32958: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12755 1727204144.33060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12755 1727204144.33246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12755 1727204144.33295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12755 1727204144.33335: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12755 1727204144.33578: variable 'network_provider' from source: set_fact 12755 1727204144.33743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12755 1727204144.33777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12755 1727204144.34032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12755 1727204144.34083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12755 1727204144.34102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12755 1727204144.34694: variable 'omit' from source: magic vars 12755 1727204144.34698: variable 'omit' from source: magic vars 12755 1727204144.34988: variable 'network_connections' from source: task vars 12755 1727204144.35014: variable 'controller_profile' from source: play vars 12755 1727204144.35094: variable 'controller_profile' from source: play vars 12755 1727204144.35471: variable 'omit' from source: magic vars 12755 1727204144.35894: variable '__lsr_ansible_managed' from source: task vars 12755 1727204144.35898: variable '__lsr_ansible_managed' from source: task vars 12755 1727204144.36206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12755 1727204144.37095: Loaded config def from plugin (lookup/template) 12755 1727204144.37099: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12755 1727204144.37102: File lookup term: get_ansible_managed.j2 12755 1727204144.37105: 
variable 'ansible_search_path' from source: unknown 12755 1727204144.37108: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12755 1727204144.37115: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12755 1727204144.37118: variable 'ansible_search_path' from source: unknown 12755 1727204144.56090: variable 'ansible_managed' from source: unknown 12755 1727204144.56648: variable 'omit' from source: magic vars 12755 1727204144.56683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204144.56841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204144.56861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204144.56884: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204144.57029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204144.57064: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204144.57068: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204144.57074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204144.57398: Set connection var ansible_connection to ssh 12755 1727204144.57407: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204144.57410: Set connection var ansible_shell_type to sh 12755 1727204144.57429: Set connection var ansible_timeout to 10 12755 1727204144.57437: Set connection var ansible_shell_executable to /bin/sh 12755 1727204144.57446: Set connection var ansible_pipelining to False 12755 1727204144.57537: variable 'ansible_shell_executable' from source: unknown 12755 1727204144.57542: variable 'ansible_connection' from source: unknown 12755 1727204144.57545: variable 'ansible_module_compression' from source: unknown 12755 1727204144.57572: variable 'ansible_shell_type' from source: unknown 12755 1727204144.57699: variable 'ansible_shell_executable' from source: unknown 12755 1727204144.57702: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204144.57705: variable 'ansible_pipelining' from source: unknown 12755 1727204144.57707: variable 'ansible_timeout' from source: unknown 12755 1727204144.57709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204144.58278: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204144.58292: variable 'omit' from source: magic vars 12755 1727204144.58387: starting attempt loop 12755 1727204144.58392: running the handler 12755 1727204144.58410: _low_level_execute_command(): starting 12755 1727204144.58421: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204144.60009: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204144.60102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204144.60112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204144.60231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204144.60238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204144.60565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204144.60584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204144.62773: stdout chunk (state=3): >>>/root <<< 12755 
1727204144.62853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204144.62881: stderr chunk (state=3): >>><<< 12755 1727204144.62885: stdout chunk (state=3): >>><<< 12755 1727204144.62966: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204144.62971: _low_level_execute_command(): starting 12755 1727204144.62975: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692 `" && echo ansible-tmp-1727204144.6289372-16789-281190030085692="` echo /root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692 `" ) && sleep 0' 12755 1727204144.64208: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204144.64368: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204144.64479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204144.64490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204144.64565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204144.66585: stdout chunk (state=3): >>>ansible-tmp-1727204144.6289372-16789-281190030085692=/root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692 <<< 12755 1727204144.66781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204144.66984: stderr chunk (state=3): >>><<< 12755 1727204144.66992: stdout chunk (state=3): >>><<< 12755 1727204144.67013: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204144.6289372-16789-281190030085692=/root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204144.67066: variable 'ansible_module_compression' from source: unknown 12755 1727204144.67150: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12755 1727204144.67154: variable 'ansible_facts' from source: unknown 12755 1727204144.67595: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/AnsiballZ_network_connections.py 12755 1727204144.67997: Sending initial data 12755 1727204144.68000: Sent initial data (168 bytes) 12755 1727204144.69099: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204144.69110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204144.69127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204144.69197: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204144.69200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204144.69203: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204144.69209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204144.69397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204144.69417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204144.69439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204144.69452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204144.69555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204144.71287: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204144.71343: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12755 1727204144.71474: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpywm2mgo_ /root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/AnsiballZ_network_connections.py <<< 12755 1727204144.71480: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/AnsiballZ_network_connections.py" <<< 12755 1727204144.71496: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpywm2mgo_" to remote "/root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/AnsiballZ_network_connections.py" <<< 12755 1727204144.74986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204144.75223: stderr chunk (state=3): >>><<< 12755 1727204144.75226: stdout chunk (state=3): >>><<< 12755 1727204144.75254: done transferring module to remote 12755 1727204144.75268: _low_level_execute_command(): starting 12755 1727204144.75303: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/ /root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/AnsiballZ_network_connections.py && sleep 0' 12755 1727204144.77182: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204144.77293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204144.77366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204144.77621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204144.77725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204144.77793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204144.79897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204144.79901: stderr chunk (state=3): >>><<< 12755 1727204144.79903: stdout chunk (state=3): >>><<< 12755 1727204144.79980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204144.79983: _low_level_execute_command(): starting 12755 1727204144.79986: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/AnsiballZ_network_connections.py && sleep 0' 12755 1727204144.81306: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204144.81533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 
1727204144.81555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204144.81639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204145.25127: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_2g801ubv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_2g801ubv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/03d70ce0-ddeb-47d0-bf95-c6e32f95cb44: error=unknown <<< 12755 1727204145.25143: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12755 1727204145.27297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204145.27301: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204145.27568: stderr chunk (state=3): >>><<< 12755 1727204145.27571: stdout chunk (state=3): >>><<< 12755 1727204145.27595: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_2g801ubv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_2g801ubv/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/03d70ce0-ddeb-47d0-bf95-c6e32f95cb44: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204145.27764: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'down', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204145.27781: _low_level_execute_command(): starting 12755 1727204145.27785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204144.6289372-16789-281190030085692/ > /dev/null 2>&1 && sleep 0' 12755 
1727204145.29420: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204145.29437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204145.29502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204145.29674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204145.29732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204145.30256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204145.32329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204145.32334: stdout chunk (state=3): >>><<< 12755 1727204145.32336: stderr chunk (state=3): >>><<< 12755 1727204145.32339: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204145.32342: handler run complete 12755 1727204145.32604: attempt loop complete, returning result 12755 1727204145.32608: _execute() done 12755 1727204145.32611: dumping result to json 12755 1727204145.32613: done dumping result, returning 12755 1727204145.32615: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-72e9-1a19-00000000017c] 12755 1727204145.32622: sending task result for task 12b410aa-8751-72e9-1a19-00000000017c changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12755 1727204145.33069: no more pending results, returning what we have 12755 1727204145.33073: results queue empty 12755 1727204145.33075: checking for any_errors_fatal 12755 1727204145.33082: done checking for any_errors_fatal 12755 1727204145.33083: checking for 
max_fail_percentage 12755 1727204145.33085: done checking for max_fail_percentage 12755 1727204145.33086: checking to see if all hosts have failed and the running result is not ok 12755 1727204145.33087: done checking to see if all hosts have failed 12755 1727204145.33088: getting the remaining hosts for this loop 12755 1727204145.33258: done getting the remaining hosts for this loop 12755 1727204145.33264: getting the next task for host managed-node1 12755 1727204145.33273: done getting next task for host managed-node1 12755 1727204145.33277: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12755 1727204145.33282: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204145.33365: done sending task result for task 12b410aa-8751-72e9-1a19-00000000017c 12755 1727204145.33369: WORKER PROCESS EXITING 12755 1727204145.33382: getting variables 12755 1727204145.33384: in VariableManager get_vars() 12755 1727204145.33447: Calling all_inventory to load vars for managed-node1 12755 1727204145.33451: Calling groups_inventory to load vars for managed-node1 12755 1727204145.33454: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204145.33468: Calling all_plugins_play to load vars for managed-node1 12755 1727204145.33472: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204145.33476: Calling groups_plugins_play to load vars for managed-node1 12755 1727204145.39352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204145.47709: done with get_vars() 12755 1727204145.47761: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:55:45 -0400 (0:00:01.244) 0:01:10.721 ***** 12755 1727204145.48783: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204145.50391: worker is 1 (out of 1 available) 12755 1727204145.50407: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12755 1727204145.50712: done queuing things up, now waiting for results queue to drain 12755 1727204145.50714: waiting for pending results... 
12755 1727204145.51375: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 12755 1727204145.52069: in run() - task 12b410aa-8751-72e9-1a19-00000000017d 12755 1727204145.52558: variable 'ansible_search_path' from source: unknown 12755 1727204145.52562: variable 'ansible_search_path' from source: unknown 12755 1727204145.52565: calling self._execute() 12755 1727204145.53002: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204145.53401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204145.53405: variable 'omit' from source: magic vars 12755 1727204145.54307: variable 'ansible_distribution_major_version' from source: facts 12755 1727204145.54328: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204145.54713: variable 'network_state' from source: role '' defaults 12755 1727204145.54794: Evaluated conditional (network_state != {}): False 12755 1727204145.54798: when evaluation is False, skipping this task 12755 1727204145.54800: _execute() done 12755 1727204145.54802: dumping result to json 12755 1727204145.54805: done dumping result, returning 12755 1727204145.54808: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-72e9-1a19-00000000017d] 12755 1727204145.54811: sending task result for task 12b410aa-8751-72e9-1a19-00000000017d skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12755 1727204145.55113: no more pending results, returning what we have 12755 1727204145.55117: results queue empty 12755 1727204145.55119: checking for any_errors_fatal 12755 1727204145.55131: done checking for any_errors_fatal 12755 1727204145.55132: checking for max_fail_percentage 12755 1727204145.55135: done checking for max_fail_percentage 12755 1727204145.55136: 
checking to see if all hosts have failed and the running result is not ok 12755 1727204145.55137: done checking to see if all hosts have failed 12755 1727204145.55138: getting the remaining hosts for this loop 12755 1727204145.55139: done getting the remaining hosts for this loop 12755 1727204145.55145: getting the next task for host managed-node1 12755 1727204145.55157: done getting next task for host managed-node1 12755 1727204145.55162: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204145.55168: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204145.55201: getting variables 12755 1727204145.55203: in VariableManager get_vars() 12755 1727204145.55271: Calling all_inventory to load vars for managed-node1 12755 1727204145.55274: Calling groups_inventory to load vars for managed-node1 12755 1727204145.55277: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204145.55696: Calling all_plugins_play to load vars for managed-node1 12755 1727204145.55702: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204145.55708: Calling groups_plugins_play to load vars for managed-node1 12755 1727204145.56467: done sending task result for task 12b410aa-8751-72e9-1a19-00000000017d 12755 1727204145.56470: WORKER PROCESS EXITING 12755 1727204145.61868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204145.68872: done with get_vars() 12755 1727204145.69099: done getting variables 12755 1727204145.69175: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.206) 0:01:10.928 ***** 12755 1727204145.69232: entering _queue_task() for managed-node1/debug 12755 1727204145.69894: worker is 1 (out of 1 available) 12755 1727204145.69950: exiting _queue_task() for managed-node1/debug 12755 1727204145.69963: done queuing things up, now waiting for results queue to drain 12755 1727204145.69965: waiting for pending results... 
12755 1727204145.70385: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12755 1727204145.70590: in run() - task 12b410aa-8751-72e9-1a19-00000000017e 12755 1727204145.70610: variable 'ansible_search_path' from source: unknown 12755 1727204145.70615: variable 'ansible_search_path' from source: unknown 12755 1727204145.70663: calling self._execute() 12755 1727204145.70786: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204145.70796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204145.70808: variable 'omit' from source: magic vars 12755 1727204145.71610: variable 'ansible_distribution_major_version' from source: facts 12755 1727204145.71663: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204145.71667: variable 'omit' from source: magic vars 12755 1727204145.71883: variable 'omit' from source: magic vars 12755 1727204145.72061: variable 'omit' from source: magic vars 12755 1727204145.72069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204145.72108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204145.72149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204145.72199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204145.72303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204145.72309: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204145.72312: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204145.72314: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12755 1727204145.72628: Set connection var ansible_connection to ssh 12755 1727204145.72633: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204145.72635: Set connection var ansible_shell_type to sh 12755 1727204145.72640: Set connection var ansible_timeout to 10 12755 1727204145.72643: Set connection var ansible_shell_executable to /bin/sh 12755 1727204145.72716: Set connection var ansible_pipelining to False 12755 1727204145.72721: variable 'ansible_shell_executable' from source: unknown 12755 1727204145.72726: variable 'ansible_connection' from source: unknown 12755 1727204145.72755: variable 'ansible_module_compression' from source: unknown 12755 1727204145.72759: variable 'ansible_shell_type' from source: unknown 12755 1727204145.72762: variable 'ansible_shell_executable' from source: unknown 12755 1727204145.72846: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204145.72849: variable 'ansible_pipelining' from source: unknown 12755 1727204145.72852: variable 'ansible_timeout' from source: unknown 12755 1727204145.72854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204145.73101: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204145.73105: variable 'omit' from source: magic vars 12755 1727204145.73108: starting attempt loop 12755 1727204145.73110: running the handler 12755 1727204145.73281: variable '__network_connections_result' from source: set_fact 12755 1727204145.73432: handler run complete 12755 1727204145.73436: attempt loop complete, returning result 12755 1727204145.73439: _execute() done 12755 1727204145.73441: dumping result to json 12755 1727204145.73443: 
done dumping result, returning 12755 1727204145.73446: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-72e9-1a19-00000000017e] 12755 1727204145.73448: sending task result for task 12b410aa-8751-72e9-1a19-00000000017e ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 12755 1727204145.73697: no more pending results, returning what we have 12755 1727204145.73701: results queue empty 12755 1727204145.73707: checking for any_errors_fatal 12755 1727204145.73716: done checking for any_errors_fatal 12755 1727204145.73718: checking for max_fail_percentage 12755 1727204145.73720: done checking for max_fail_percentage 12755 1727204145.73721: checking to see if all hosts have failed and the running result is not ok 12755 1727204145.73722: done checking to see if all hosts have failed 12755 1727204145.73723: getting the remaining hosts for this loop 12755 1727204145.73725: done getting the remaining hosts for this loop 12755 1727204145.73730: getting the next task for host managed-node1 12755 1727204145.73739: done getting next task for host managed-node1 12755 1727204145.73743: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204145.73749: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204145.73764: getting variables 12755 1727204145.73766: in VariableManager get_vars() 12755 1727204145.74038: Calling all_inventory to load vars for managed-node1 12755 1727204145.74042: Calling groups_inventory to load vars for managed-node1 12755 1727204145.74046: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204145.74081: Calling all_plugins_play to load vars for managed-node1 12755 1727204145.74087: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204145.74094: Calling groups_plugins_play to load vars for managed-node1 12755 1727204145.74623: done sending task result for task 12b410aa-8751-72e9-1a19-00000000017e 12755 1727204145.74627: WORKER PROCESS EXITING 12755 1727204145.79482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204145.85177: done with get_vars() 12755 1727204145.85259: done getting variables 12755 1727204145.85344: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.161) 0:01:11.089 ***** 12755 1727204145.85386: entering _queue_task() for managed-node1/debug 12755 1727204145.85961: worker is 1 (out of 1 available) 12755 1727204145.86206: exiting _queue_task() for managed-node1/debug 12755 
1727204145.86224: done queuing things up, now waiting for results queue to drain 12755 1727204145.86226: waiting for pending results... 12755 1727204145.86473: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12755 1727204145.86850: in run() - task 12b410aa-8751-72e9-1a19-00000000017f 12755 1727204145.86998: variable 'ansible_search_path' from source: unknown 12755 1727204145.87001: variable 'ansible_search_path' from source: unknown 12755 1727204145.87101: calling self._execute() 12755 1727204145.87287: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204145.87407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204145.87441: variable 'omit' from source: magic vars 12755 1727204145.88545: variable 'ansible_distribution_major_version' from source: facts 12755 1727204145.88608: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204145.88629: variable 'omit' from source: magic vars 12755 1727204145.88739: variable 'omit' from source: magic vars 12755 1727204145.88816: variable 'omit' from source: magic vars 12755 1727204145.88997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204145.89057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204145.89144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204145.89296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204145.89300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204145.89302: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 
1727204145.89437: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204145.89440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204145.89703: Set connection var ansible_connection to ssh 12755 1727204145.89916: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204145.89919: Set connection var ansible_shell_type to sh 12755 1727204145.89922: Set connection var ansible_timeout to 10 12755 1727204145.89926: Set connection var ansible_shell_executable to /bin/sh 12755 1727204145.89929: Set connection var ansible_pipelining to False 12755 1727204145.89932: variable 'ansible_shell_executable' from source: unknown 12755 1727204145.89934: variable 'ansible_connection' from source: unknown 12755 1727204145.89937: variable 'ansible_module_compression' from source: unknown 12755 1727204145.89939: variable 'ansible_shell_type' from source: unknown 12755 1727204145.89942: variable 'ansible_shell_executable' from source: unknown 12755 1727204145.89944: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204145.89946: variable 'ansible_pipelining' from source: unknown 12755 1727204145.89948: variable 'ansible_timeout' from source: unknown 12755 1727204145.90009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204145.90429: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204145.90696: variable 'omit' from source: magic vars 12755 1727204145.90718: starting attempt loop 12755 1727204145.90739: running the handler 12755 1727204145.90939: variable '__network_connections_result' from source: set_fact 12755 1727204145.91129: variable '__network_connections_result' from 
source: set_fact 12755 1727204145.91609: handler run complete 12755 1727204145.91615: attempt loop complete, returning result 12755 1727204145.91618: _execute() done 12755 1727204145.91621: dumping result to json 12755 1727204145.91624: done dumping result, returning 12755 1727204145.91627: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-72e9-1a19-00000000017f] 12755 1727204145.91630: sending task result for task 12b410aa-8751-72e9-1a19-00000000017f 12755 1727204145.91959: done sending task result for task 12b410aa-8751-72e9-1a19-00000000017f 12755 1727204145.91963: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12755 1727204145.92080: no more pending results, returning what we have 12755 1727204145.92084: results queue empty 12755 1727204145.92086: checking for any_errors_fatal 12755 1727204145.92092: done checking for any_errors_fatal 12755 1727204145.92093: checking for max_fail_percentage 12755 1727204145.92095: done checking for max_fail_percentage 12755 1727204145.92096: checking to see if all hosts have failed and the running result is not ok 12755 1727204145.92098: done checking to see if all hosts have failed 12755 1727204145.92098: getting the remaining hosts for this loop 12755 1727204145.92100: done getting the remaining hosts for this loop 12755 1727204145.92105: getting the next task for host managed-node1 12755 1727204145.92114: done getting next task for host managed-node1 12755 1727204145.92120: ^ task is: TASK: fedora.linux_system_roles.network : Show debug 
messages for the network_state 12755 1727204145.92125: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204145.92140: getting variables 12755 1727204145.92142: in VariableManager get_vars() 12755 1727204145.92517: Calling all_inventory to load vars for managed-node1 12755 1727204145.92522: Calling groups_inventory to load vars for managed-node1 12755 1727204145.92526: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204145.92541: Calling all_plugins_play to load vars for managed-node1 12755 1727204145.92545: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204145.92550: Calling groups_plugins_play to load vars for managed-node1 12755 1727204145.95496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204145.99317: done with get_vars() 12755 1727204145.99377: done getting variables 12755 1727204145.99466: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:55:45 -0400 (0:00:00.141) 0:01:11.230 ***** 12755 1727204145.99516: entering _queue_task() for managed-node1/debug 12755 1727204145.99939: worker is 1 (out of 1 available) 12755 1727204145.99955: exiting _queue_task() for managed-node1/debug 12755 1727204145.99971: done queuing things up, now waiting for results queue to drain 12755 1727204146.00097: waiting for pending results... 12755 1727204146.00594: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12755 1727204146.00788: in run() - task 12b410aa-8751-72e9-1a19-000000000180 12755 1727204146.01121: variable 'ansible_search_path' from source: unknown 12755 1727204146.01125: variable 'ansible_search_path' from source: unknown 12755 1727204146.01128: calling self._execute() 12755 1727204146.01382: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204146.01411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204146.01431: variable 'omit' from source: magic vars 12755 1727204146.02404: variable 'ansible_distribution_major_version' from source: facts 12755 1727204146.02491: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204146.02841: variable 'network_state' from source: role '' defaults 12755 1727204146.02863: Evaluated conditional (network_state != {}): False 12755 1727204146.02872: when evaluation is False, skipping this task 12755 1727204146.02880: _execute() done 12755 1727204146.02923: dumping result to json 12755 1727204146.02931: done 
dumping result, returning 12755 1727204146.02944: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-72e9-1a19-000000000180] 12755 1727204146.02954: sending task result for task 12b410aa-8751-72e9-1a19-000000000180 12755 1727204146.03412: done sending task result for task 12b410aa-8751-72e9-1a19-000000000180 12755 1727204146.03415: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 12755 1727204146.03484: no more pending results, returning what we have 12755 1727204146.03488: results queue empty 12755 1727204146.03492: checking for any_errors_fatal 12755 1727204146.03506: done checking for any_errors_fatal 12755 1727204146.03508: checking for max_fail_percentage 12755 1727204146.03510: done checking for max_fail_percentage 12755 1727204146.03511: checking to see if all hosts have failed and the running result is not ok 12755 1727204146.03512: done checking to see if all hosts have failed 12755 1727204146.03513: getting the remaining hosts for this loop 12755 1727204146.03516: done getting the remaining hosts for this loop 12755 1727204146.03522: getting the next task for host managed-node1 12755 1727204146.03539: done getting next task for host managed-node1 12755 1727204146.03546: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12755 1727204146.03553: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204146.03582: getting variables 12755 1727204146.03585: in VariableManager get_vars() 12755 1727204146.03809: Calling all_inventory to load vars for managed-node1 12755 1727204146.03813: Calling groups_inventory to load vars for managed-node1 12755 1727204146.03817: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204146.03833: Calling all_plugins_play to load vars for managed-node1 12755 1727204146.03838: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204146.03843: Calling groups_plugins_play to load vars for managed-node1 12755 1727204146.09073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204146.15410: done with get_vars() 12755 1727204146.15463: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.163) 0:01:11.393 ***** 12755 1727204146.15828: entering _queue_task() for managed-node1/ping 12755 1727204146.16647: worker is 1 (out of 1 available) 12755 1727204146.16661: exiting _queue_task() for managed-node1/ping 12755 1727204146.16674: done queuing things up, now waiting for results queue to drain 12755 1727204146.16676: waiting for pending results... 
12755 1727204146.17169: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12755 1727204146.17702: in run() - task 12b410aa-8751-72e9-1a19-000000000181 12755 1727204146.17728: variable 'ansible_search_path' from source: unknown 12755 1727204146.17732: variable 'ansible_search_path' from source: unknown 12755 1727204146.17947: calling self._execute() 12755 1727204146.17952: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204146.17955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204146.17957: variable 'omit' from source: magic vars 12755 1727204146.18487: variable 'ansible_distribution_major_version' from source: facts 12755 1727204146.18597: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204146.18601: variable 'omit' from source: magic vars 12755 1727204146.18614: variable 'omit' from source: magic vars 12755 1727204146.18866: variable 'omit' from source: magic vars 12755 1727204146.19068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204146.19146: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204146.19175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204146.19299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204146.19323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204146.19466: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204146.19470: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204146.19472: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12755 1727204146.19539: Set connection var ansible_connection to ssh 12755 1727204146.19555: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204146.19563: Set connection var ansible_shell_type to sh 12755 1727204146.19594: Set connection var ansible_timeout to 10 12755 1727204146.19612: Set connection var ansible_shell_executable to /bin/sh 12755 1727204146.19626: Set connection var ansible_pipelining to False 12755 1727204146.19657: variable 'ansible_shell_executable' from source: unknown 12755 1727204146.19665: variable 'ansible_connection' from source: unknown 12755 1727204146.19682: variable 'ansible_module_compression' from source: unknown 12755 1727204146.19686: variable 'ansible_shell_type' from source: unknown 12755 1727204146.19793: variable 'ansible_shell_executable' from source: unknown 12755 1727204146.19799: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204146.19802: variable 'ansible_pipelining' from source: unknown 12755 1727204146.19804: variable 'ansible_timeout' from source: unknown 12755 1727204146.19807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204146.19982: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12755 1727204146.20015: variable 'omit' from source: magic vars 12755 1727204146.20028: starting attempt loop 12755 1727204146.20036: running the handler 12755 1727204146.20056: _low_level_execute_command(): starting 12755 1727204146.20069: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204146.20999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204146.21005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204146.21066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204146.21186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204146.21241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204146.23138: stdout chunk (state=3): >>>/root <<< 12755 1727204146.23381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204146.23426: stdout chunk (state=3): >>><<< 12755 1727204146.23430: stderr chunk (state=3): >>><<< 12755 1727204146.23458: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204146.23488: _low_level_execute_command(): starting 12755 1727204146.23508: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013 `" && echo ansible-tmp-1727204146.2346733-16903-5585772248013="` echo /root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013 `" ) && sleep 0' 12755 1727204146.24347: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204146.24422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204146.24525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204146.26623: stdout chunk (state=3): >>>ansible-tmp-1727204146.2346733-16903-5585772248013=/root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013 <<< 12755 1727204146.26908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204146.26947: stderr chunk (state=3): >>><<< 12755 1727204146.27096: stdout chunk (state=3): >>><<< 12755 1727204146.27101: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204146.2346733-16903-5585772248013=/root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204146.27301: variable 'ansible_module_compression' from source: unknown 12755 1727204146.27305: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12755 1727204146.27307: variable 'ansible_facts' from source: unknown 12755 1727204146.27396: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/AnsiballZ_ping.py 12755 1727204146.27909: Sending initial data 12755 1727204146.27920: Sent initial data (151 bytes) 12755 1727204146.29340: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204146.29555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204146.29596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204146.31383: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: 
Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204146.31447: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12755 1727204146.31541: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpxuo16lo7 /root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/AnsiballZ_ping.py <<< 12755 1727204146.31545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/AnsiballZ_ping.py" <<< 12755 1727204146.31582: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpxuo16lo7" to remote "/root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/AnsiballZ_ping.py" <<< 12755 1727204146.32820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204146.32886: stderr chunk (state=3): >>><<< 12755 1727204146.32890: stdout chunk (state=3): >>><<< 12755 1727204146.32919: done transferring module to remote 12755 1727204146.32930: _low_level_execute_command(): starting 12755 1727204146.32936: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/ /root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/AnsiballZ_ping.py && sleep 0' 12755 1727204146.33874: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204146.33892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204146.33912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204146.34004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204146.34034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204146.34068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204146.34100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204146.34166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204146.36178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204146.36231: stderr chunk (state=3): >>><<< 12755 1727204146.36240: stdout chunk (state=3): >>><<< 12755 
1727204146.36255: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204146.36263: _low_level_execute_command(): starting 12755 1727204146.36266: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/AnsiballZ_ping.py && sleep 0' 12755 1727204146.36728: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204146.36732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204146.36735: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12755 1727204146.36737: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204146.36805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204146.36811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204146.36845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204146.54833: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12755 1727204146.56299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204146.56303: stderr chunk (state=3): >>><<< 12755 1727204146.56305: stdout chunk (state=3): >>><<< 12755 1727204146.56308: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
12755 1727204146.56352: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204146.56357: _low_level_execute_command(): starting 12755 1727204146.56360: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204146.2346733-16903-5585772248013/ > /dev/null 2>&1 && sleep 0' 12755 1727204146.57598: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204146.57602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204146.57606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204146.57608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204146.57807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204146.57846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204146.57853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204146.58035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204146.59950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204146.59956: stderr chunk (state=3): >>><<< 12755 1727204146.59961: stdout chunk (state=3): >>><<< 12755 1727204146.59986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204146.59995: handler run complete 12755 
1727204146.60025: attempt loop complete, returning result 12755 1727204146.60028: _execute() done 12755 1727204146.60031: dumping result to json 12755 1727204146.60033: done dumping result, returning 12755 1727204146.60044: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-72e9-1a19-000000000181] 12755 1727204146.60049: sending task result for task 12b410aa-8751-72e9-1a19-000000000181 12755 1727204146.60306: done sending task result for task 12b410aa-8751-72e9-1a19-000000000181 12755 1727204146.60311: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 12755 1727204146.60406: no more pending results, returning what we have 12755 1727204146.60413: results queue empty 12755 1727204146.60414: checking for any_errors_fatal 12755 1727204146.60427: done checking for any_errors_fatal 12755 1727204146.60428: checking for max_fail_percentage 12755 1727204146.60430: done checking for max_fail_percentage 12755 1727204146.60431: checking to see if all hosts have failed and the running result is not ok 12755 1727204146.60433: done checking to see if all hosts have failed 12755 1727204146.60434: getting the remaining hosts for this loop 12755 1727204146.60435: done getting the remaining hosts for this loop 12755 1727204146.60441: getting the next task for host managed-node1 12755 1727204146.60455: done getting next task for host managed-node1 12755 1727204146.60458: ^ task is: TASK: meta (role_complete) 12755 1727204146.60463: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204146.60480: getting variables 12755 1727204146.60482: in VariableManager get_vars() 12755 1727204146.60855: Calling all_inventory to load vars for managed-node1 12755 1727204146.60859: Calling groups_inventory to load vars for managed-node1 12755 1727204146.60863: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204146.60878: Calling all_plugins_play to load vars for managed-node1 12755 1727204146.60882: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204146.60887: Calling groups_plugins_play to load vars for managed-node1 12755 1727204146.65168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204146.71850: done with get_vars() 12755 1727204146.71897: done getting variables 12755 1727204146.72074: done queuing things up, now waiting for results queue to drain 12755 1727204146.72077: results queue empty 12755 1727204146.72078: checking for any_errors_fatal 12755 1727204146.72082: done checking for any_errors_fatal 12755 1727204146.72083: checking for max_fail_percentage 12755 1727204146.72085: done checking for max_fail_percentage 12755 1727204146.72086: checking to see if all hosts have failed and the running result is not ok 12755 1727204146.72087: done checking to see if all hosts have failed 12755 1727204146.72088: getting the remaining hosts for this loop 12755 1727204146.72092: done getting the remaining hosts for this loop 12755 1727204146.72096: getting the next task for host managed-node1 12755 
1727204146.72102: done getting next task for host managed-node1 12755 1727204146.72104: ^ task is: TASK: Delete the device '{{ controller_device }}' 12755 1727204146.72107: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204146.72113: getting variables 12755 1727204146.72114: in VariableManager get_vars() 12755 1727204146.72197: Calling all_inventory to load vars for managed-node1 12755 1727204146.72201: Calling groups_inventory to load vars for managed-node1 12755 1727204146.72204: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204146.72213: Calling all_plugins_play to load vars for managed-node1 12755 1727204146.72217: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204146.72220: Calling groups_plugins_play to load vars for managed-node1 12755 1727204146.75488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204146.81217: done with get_vars() 12755 1727204146.81357: done getting variables 12755 1727204146.81421: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12755 1727204146.81778: variable 'controller_device' from source: play vars TASK [Delete the device 
'nm-bond'] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 Tuesday 24 September 2024 14:55:46 -0400 (0:00:00.659) 0:01:12.053 ***** 12755 1727204146.81819: entering _queue_task() for managed-node1/command 12755 1727204146.82494: worker is 1 (out of 1 available) 12755 1727204146.82507: exiting _queue_task() for managed-node1/command 12755 1727204146.82521: done queuing things up, now waiting for results queue to drain 12755 1727204146.82522: waiting for pending results... 12755 1727204146.83534: running TaskExecutor() for managed-node1/TASK: Delete the device 'nm-bond' 12755 1727204146.84124: in run() - task 12b410aa-8751-72e9-1a19-0000000001b1 12755 1727204146.84129: variable 'ansible_search_path' from source: unknown 12755 1727204146.84132: calling self._execute() 12755 1727204146.84792: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204146.84812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204146.84965: variable 'omit' from source: magic vars 12755 1727204146.86195: variable 'ansible_distribution_major_version' from source: facts 12755 1727204146.86199: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204146.86245: variable 'omit' from source: magic vars 12755 1727204146.86277: variable 'omit' from source: magic vars 12755 1727204146.86794: variable 'controller_device' from source: play vars 12755 1727204146.86799: variable 'omit' from source: magic vars 12755 1727204146.87147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204146.87256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204146.87259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204146.87262: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204146.87374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204146.87422: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204146.87591: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204146.87797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204146.87938: Set connection var ansible_connection to ssh 12755 1727204146.88031: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204146.88080: Set connection var ansible_shell_type to sh 12755 1727204146.88105: Set connection var ansible_timeout to 10 12755 1727204146.88207: Set connection var ansible_shell_executable to /bin/sh 12755 1727204146.88299: Set connection var ansible_pipelining to False 12755 1727204146.88334: variable 'ansible_shell_executable' from source: unknown 12755 1727204146.88345: variable 'ansible_connection' from source: unknown 12755 1727204146.88354: variable 'ansible_module_compression' from source: unknown 12755 1727204146.88362: variable 'ansible_shell_type' from source: unknown 12755 1727204146.88401: variable 'ansible_shell_executable' from source: unknown 12755 1727204146.88414: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204146.88426: variable 'ansible_pipelining' from source: unknown 12755 1727204146.88595: variable 'ansible_timeout' from source: unknown 12755 1727204146.88599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204146.88872: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204146.88959: variable 'omit' from source: magic vars 12755 1727204146.88970: starting attempt loop 12755 1727204146.88977: running the handler 12755 1727204146.89002: _low_level_execute_command(): starting 12755 1727204146.89063: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204146.91037: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204146.91043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204146.91512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204146.91894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204146.93566: stdout chunk (state=3): >>>/root <<< 12755 1727204146.93806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204146.93809: stdout 
chunk (state=3): >>><<< 12755 1727204146.93812: stderr chunk (state=3): >>><<< 12755 1727204146.93833: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204146.93855: _low_level_execute_command(): starting 12755 1727204146.93867: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536 `" && echo ansible-tmp-1727204146.9384067-17039-138313616109536="` echo /root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536 `" ) && sleep 0' 12755 1727204146.94895: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204146.95205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204146.95307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204146.95422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204146.97522: stdout chunk (state=3): >>>ansible-tmp-1727204146.9384067-17039-138313616109536=/root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536 <<< 12755 1727204146.97663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204146.97805: stderr chunk (state=3): >>><<< 12755 1727204146.97815: stdout chunk (state=3): >>><<< 12755 1727204146.97842: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204146.9384067-17039-138313616109536=/root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204146.97885: variable 'ansible_module_compression' from source: unknown 12755 1727204146.98048: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204146.98496: variable 'ansible_facts' from source: unknown 12755 1727204146.98601: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/AnsiballZ_command.py 12755 1727204146.99223: Sending initial data 12755 1727204146.99227: Sent initial data (156 bytes) 12755 1727204147.00907: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204147.01410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204147.01430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204147.01503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.03300: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204147.03342: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204147.03384: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpg8rs9m9_ /root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/AnsiballZ_command.py <<< 12755 1727204147.03402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/AnsiballZ_command.py" <<< 12755 1727204147.03451: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpg8rs9m9_" to remote "/root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/AnsiballZ_command.py" <<< 12755 1727204147.05643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204147.05676: stderr chunk (state=3): >>><<< 12755 1727204147.05686: stdout chunk (state=3): >>><<< 12755 1727204147.05724: done transferring module to remote 12755 1727204147.05813: _low_level_execute_command(): starting 12755 1727204147.05826: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/ /root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/AnsiballZ_command.py && sleep 0' 12755 1727204147.07295: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204147.07316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204147.07406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204147.07538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204147.07555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204147.07707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.09806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204147.09837: stderr chunk (state=3): >>><<< 12755 1727204147.09847: stdout chunk (state=3): >>><<< 12755 1727204147.09869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204147.09915: _low_level_execute_command(): starting 12755 1727204147.09928: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/AnsiballZ_command.py && sleep 0' 12755 1727204147.10898: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204147.10917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204147.10937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204147.10959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204147.10976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204147.11002: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204147.11100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204147.11123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 12755 1727204147.11268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.30056: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:47.292153", "end": "2024-09-24 14:55:47.299642", "delta": "0:00:00.007489", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204147.31651: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.210 closed. <<< 12755 1727204147.31772: stderr chunk (state=3): >>><<< 12755 1727204147.31775: stdout chunk (state=3): >>><<< 12755 1727204147.31778: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:55:47.292153", "end": "2024-09-24 14:55:47.299642", "delta": "0:00:00.007489", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.210 closed. 12755 1727204147.31782: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204147.31784: _low_level_execute_command(): starting 12755 1727204147.31787: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204146.9384067-17039-138313616109536/ > /dev/null 2>&1 && sleep 0' 12755 1727204147.32250: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204147.32280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204147.32284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 12755 1727204147.32286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204147.32291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204147.32347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204147.32354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204147.32401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.34695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204147.34699: stdout chunk (state=3): >>><<< 12755 1727204147.34701: stderr chunk (state=3): >>><<< 12755 1727204147.34704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204147.34713: handler run complete 12755 1727204147.34715: Evaluated conditional (False): False 12755 1727204147.34717: Evaluated conditional (False): False 12755 1727204147.34719: attempt loop complete, returning result 12755 1727204147.34721: _execute() done 12755 1727204147.34723: dumping result to json 12755 1727204147.34725: done dumping result, returning 12755 1727204147.34727: done running TaskExecutor() for managed-node1/TASK: Delete the device 'nm-bond' [12b410aa-8751-72e9-1a19-0000000001b1] 12755 1727204147.34730: sending task result for task 12b410aa-8751-72e9-1a19-0000000001b1 12755 1727204147.34835: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001b1 ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007489", "end": "2024-09-24 14:55:47.299642", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:55:47.292153" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 12755 1727204147.35046: no more pending results, returning what we have 12755 1727204147.35051: results queue empty 12755 1727204147.35052: checking for any_errors_fatal 
12755 1727204147.35055: done checking for any_errors_fatal 12755 1727204147.35056: checking for max_fail_percentage 12755 1727204147.35058: done checking for max_fail_percentage 12755 1727204147.35059: checking to see if all hosts have failed and the running result is not ok 12755 1727204147.35060: done checking to see if all hosts have failed 12755 1727204147.35061: getting the remaining hosts for this loop 12755 1727204147.35063: done getting the remaining hosts for this loop 12755 1727204147.35068: getting the next task for host managed-node1 12755 1727204147.35078: done getting next task for host managed-node1 12755 1727204147.35081: ^ task is: TASK: Remove test interfaces 12755 1727204147.35084: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204147.35091: getting variables 12755 1727204147.35093: in VariableManager get_vars() 12755 1727204147.35160: Calling all_inventory to load vars for managed-node1 12755 1727204147.35163: Calling groups_inventory to load vars for managed-node1 12755 1727204147.35170: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204147.35185: WORKER PROCESS EXITING 12755 1727204147.35462: Calling all_plugins_play to load vars for managed-node1 12755 1727204147.35467: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204147.35471: Calling groups_plugins_play to load vars for managed-node1 12755 1727204147.37495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204147.39111: done with get_vars() 12755 1727204147.39138: done getting variables 12755 1727204147.39195: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.574) 0:01:12.627 ***** 12755 1727204147.39227: entering _queue_task() for managed-node1/shell 12755 1727204147.39512: worker is 1 (out of 1 available) 12755 1727204147.39527: exiting _queue_task() for managed-node1/shell 12755 1727204147.39540: done queuing things up, now waiting for results queue to drain 12755 1727204147.39543: waiting for pending results... 
12755 1727204147.39743: running TaskExecutor() for managed-node1/TASK: Remove test interfaces 12755 1727204147.39860: in run() - task 12b410aa-8751-72e9-1a19-0000000001b5 12755 1727204147.39875: variable 'ansible_search_path' from source: unknown 12755 1727204147.39879: variable 'ansible_search_path' from source: unknown 12755 1727204147.39918: calling self._execute() 12755 1727204147.40028: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204147.40038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204147.40067: variable 'omit' from source: magic vars 12755 1727204147.40494: variable 'ansible_distribution_major_version' from source: facts 12755 1727204147.40506: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204147.40514: variable 'omit' from source: magic vars 12755 1727204147.40584: variable 'omit' from source: magic vars 12755 1727204147.40717: variable 'dhcp_interface1' from source: play vars 12755 1727204147.40722: variable 'dhcp_interface2' from source: play vars 12755 1727204147.40740: variable 'omit' from source: magic vars 12755 1727204147.40782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204147.40818: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204147.40856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204147.40872: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204147.40918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204147.40935: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204147.40938: variable 'ansible_host' from source: host 
vars for 'managed-node1' 12755 1727204147.40941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204147.41022: Set connection var ansible_connection to ssh 12755 1727204147.41034: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204147.41037: Set connection var ansible_shell_type to sh 12755 1727204147.41047: Set connection var ansible_timeout to 10 12755 1727204147.41054: Set connection var ansible_shell_executable to /bin/sh 12755 1727204147.41060: Set connection var ansible_pipelining to False 12755 1727204147.41080: variable 'ansible_shell_executable' from source: unknown 12755 1727204147.41083: variable 'ansible_connection' from source: unknown 12755 1727204147.41086: variable 'ansible_module_compression' from source: unknown 12755 1727204147.41092: variable 'ansible_shell_type' from source: unknown 12755 1727204147.41096: variable 'ansible_shell_executable' from source: unknown 12755 1727204147.41099: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204147.41105: variable 'ansible_pipelining' from source: unknown 12755 1727204147.41111: variable 'ansible_timeout' from source: unknown 12755 1727204147.41114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204147.41250: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204147.41261: variable 'omit' from source: magic vars 12755 1727204147.41266: starting attempt loop 12755 1727204147.41269: running the handler 12755 1727204147.41280: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204147.41299: _low_level_execute_command(): starting 12755 1727204147.41306: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204147.41882: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204147.41887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204147.41937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204147.41947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204147.42007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.43772: stdout chunk (state=3): >>>/root <<< 12755 1727204147.43927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204147.43980: stderr chunk (state=3): >>><<< 12755 1727204147.43985: stdout chunk (state=3): >>><<< 12755 
1727204147.44011: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204147.44023: _low_level_execute_command(): starting 12755 1727204147.44030: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000 `" && echo ansible-tmp-1727204147.4400954-17101-30406753054000="` echo /root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000 `" ) && sleep 0' 12755 1727204147.44495: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204147.44503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204147.44536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204147.44579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204147.44582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204147.44641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.46750: stdout chunk (state=3): >>>ansible-tmp-1727204147.4400954-17101-30406753054000=/root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000 <<< 12755 1727204147.46900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204147.46963: stderr chunk (state=3): >>><<< 12755 1727204147.46967: stdout chunk (state=3): >>><<< 12755 1727204147.46984: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204147.4400954-17101-30406753054000=/root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204147.47015: variable 'ansible_module_compression' from source: unknown 12755 1727204147.47072: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204147.47120: variable 'ansible_facts' from source: unknown 12755 1727204147.47204: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/AnsiballZ_command.py 12755 1727204147.47376: Sending initial data 12755 1727204147.47380: Sent initial data (155 bytes) 12755 1727204147.47934: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204147.47937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204147.47941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204147.47944: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204147.47946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204147.47999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204147.48002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204147.48045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.49745: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204147.49851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204147.49921: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpqf9la6ky /root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/AnsiballZ_command.py <<< 12755 1727204147.49925: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/AnsiballZ_command.py" <<< 12755 1727204147.49993: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpqf9la6ky" to remote "/root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/AnsiballZ_command.py" <<< 12755 1727204147.50934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204147.51004: stderr chunk (state=3): >>><<< 12755 1727204147.51010: stdout chunk (state=3): >>><<< 12755 1727204147.51041: done transferring module to remote 12755 1727204147.51056: _low_level_execute_command(): starting 12755 1727204147.51065: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/ /root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/AnsiballZ_command.py && sleep 0' 12755 1727204147.52104: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204147.52124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204147.52141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204147.52260: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204147.52381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204147.52456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.54527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204147.54549: stderr chunk (state=3): >>><<< 12755 1727204147.54553: stdout chunk (state=3): >>><<< 12755 1727204147.54573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204147.54577: _low_level_execute_command(): starting 12755 1727204147.54680: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/AnsiballZ_command.py && sleep 0' 12755 1727204147.55280: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204147.55307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204147.55495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204147.55553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204147.55647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.78029: stdout chunk (state=3): >>> {"changed": true, "stdout": "", 
"stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:47.737245", "end": "2024-09-24 14:55:47.778735", "delta": "0:00:00.041490", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204147.80845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204147.81136: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204147.81141: stdout chunk (state=3): >>><<< 12755 1727204147.81144: stderr chunk (state=3): >>><<< 12755 1727204147.81241: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:55:47.737245", "end": "2024-09-24 14:55:47.778735", "delta": "0:00:00.041490", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204147.81245: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, 
'_ansible_target_log_info': None}) 12755 1727204147.81253: _low_level_execute_command(): starting 12755 1727204147.81438: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204147.4400954-17101-30406753054000/ > /dev/null 2>&1 && sleep 0' 12755 1727204147.82524: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204147.82554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204147.82570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204147.82703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204147.82873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204147.82886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204147.83238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204147.85483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204147.85486: stdout chunk (state=3): >>><<< 12755 1727204147.85488: stderr chunk (state=3): >>><<< 12755 1727204147.85795: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204147.85799: handler run complete 12755 1727204147.85801: Evaluated conditional (False): False 12755 1727204147.85804: attempt loop complete, returning result 12755 1727204147.85806: _execute() done 12755 1727204147.85808: dumping result to json 12755 1727204147.85811: done dumping result, returning 12755 1727204147.85813: done running TaskExecutor() for managed-node1/TASK: Remove test interfaces [12b410aa-8751-72e9-1a19-0000000001b5] 12755 1727204147.85815: sending task result for task 12b410aa-8751-72e9-1a19-0000000001b5 12755 1727204147.86338: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001b5 12755 1727204147.86342: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || 
rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.041490", "end": "2024-09-24 14:55:47.778735", "rc": 0, "start": "2024-09-24 14:55:47.737245" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 12755 1727204147.86451: no more pending results, returning what we have 12755 1727204147.86455: results queue empty 12755 1727204147.86457: checking for any_errors_fatal 12755 1727204147.86471: done checking for any_errors_fatal 12755 1727204147.86472: checking for max_fail_percentage 12755 1727204147.86475: done checking for max_fail_percentage 12755 1727204147.86475: checking to see if all hosts have failed and the running result is not ok 12755 1727204147.86477: done checking to see if all hosts have failed 12755 1727204147.86477: getting the remaining hosts for this loop 12755 1727204147.86479: done getting the remaining hosts for this loop 12755 1727204147.86485: getting the next task for host managed-node1 12755 1727204147.86496: done getting next task for host managed-node1 12755 1727204147.86500: ^ task is: TASK: Stop dnsmasq/radvd services 12755 1727204147.86504: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204147.86510: getting variables 12755 1727204147.86512: in VariableManager get_vars() 12755 1727204147.86582: Calling all_inventory to load vars for managed-node1 12755 1727204147.86587: Calling groups_inventory to load vars for managed-node1 12755 1727204147.87171: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204147.87195: Calling all_plugins_play to load vars for managed-node1 12755 1727204147.87199: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204147.87204: Calling groups_plugins_play to load vars for managed-node1 12755 1727204147.90382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204147.95430: done with get_vars() 12755 1727204147.95472: done getting variables 12755 1727204147.95547: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:55:47 -0400 (0:00:00.566) 0:01:13.194 ***** 12755 1727204147.95926: entering _queue_task() for managed-node1/shell 12755 1727204147.97103: worker is 1 (out of 1 
available) 12755 1727204147.97119: exiting _queue_task() for managed-node1/shell 12755 1727204147.97132: done queuing things up, now waiting for results queue to drain 12755 1727204147.97134: waiting for pending results... 12755 1727204147.97432: running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd services 12755 1727204147.97825: in run() - task 12b410aa-8751-72e9-1a19-0000000001b6 12755 1727204147.97874: variable 'ansible_search_path' from source: unknown 12755 1727204147.98071: variable 'ansible_search_path' from source: unknown 12755 1727204147.98076: calling self._execute() 12755 1727204147.98244: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204147.98301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204147.98320: variable 'omit' from source: magic vars 12755 1727204147.99308: variable 'ansible_distribution_major_version' from source: facts 12755 1727204147.99333: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204147.99427: variable 'omit' from source: magic vars 12755 1727204147.99545: variable 'omit' from source: magic vars 12755 1727204147.99753: variable 'omit' from source: magic vars 12755 1727204147.99763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204147.99815: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204147.99970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204147.99973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204148.00039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204148.00188: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 12755 1727204148.00194: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204148.00197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204148.00407: Set connection var ansible_connection to ssh 12755 1727204148.00423: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204148.00474: Set connection var ansible_shell_type to sh 12755 1727204148.00499: Set connection var ansible_timeout to 10 12755 1727204148.00624: Set connection var ansible_shell_executable to /bin/sh 12755 1727204148.00628: Set connection var ansible_pipelining to False 12755 1727204148.00630: variable 'ansible_shell_executable' from source: unknown 12755 1727204148.00635: variable 'ansible_connection' from source: unknown 12755 1727204148.00644: variable 'ansible_module_compression' from source: unknown 12755 1727204148.00651: variable 'ansible_shell_type' from source: unknown 12755 1727204148.00694: variable 'ansible_shell_executable' from source: unknown 12755 1727204148.00704: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204148.00714: variable 'ansible_pipelining' from source: unknown 12755 1727204148.00723: variable 'ansible_timeout' from source: unknown 12755 1727204148.00738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204148.01125: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204148.01191: variable 'omit' from source: magic vars 12755 1727204148.01203: starting attempt loop 12755 1727204148.01212: running the handler 12755 1727204148.01247: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204148.01387: _low_level_execute_command(): starting 12755 1727204148.01392: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204148.03087: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.03107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.03309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.05199: stdout chunk (state=3): >>>/root <<< 12755 1727204148.05353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204148.05478: stderr chunk (state=3): >>><<< 12755 1727204148.05741: stdout chunk (state=3): >>><<< 12755 1727204148.05994: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 
24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204148.05998: _low_level_execute_command(): starting 12755 1727204148.06002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054 `" && echo ansible-tmp-1727204148.0562792-17183-275167357892054="` echo /root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054 `" ) && sleep 0' 12755 1727204148.06845: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204148.06895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204148.07004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 12755 1727204148.07112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.07129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.07653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.09725: stdout chunk (state=3): >>>ansible-tmp-1727204148.0562792-17183-275167357892054=/root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054 <<< 12755 1727204148.09818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204148.09869: stderr chunk (state=3): >>><<< 12755 1727204148.09880: stdout chunk (state=3): >>><<< 12755 1727204148.09932: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204148.0562792-17183-275167357892054=/root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204148.10299: variable 'ansible_module_compression' from source: unknown 12755 1727204148.10303: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204148.10305: variable 'ansible_facts' from source: unknown 12755 1727204148.10439: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/AnsiballZ_command.py 12755 1727204148.11020: Sending initial data 12755 1727204148.11031: Sent initial data (156 bytes) 12755 1727204148.12129: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204148.12309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.12410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204148.12438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.12456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.12541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.14373: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204148.15028: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpt1nkl7m8 /root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/AnsiballZ_command.py <<< 12755 1727204148.15033: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpt1nkl7m8" to remote "/root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/AnsiballZ_command.py" <<< 12755 1727204148.16845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204148.16965: stderr chunk (state=3): >>><<< 12755 1727204148.16978: stdout chunk (state=3): >>><<< 12755 1727204148.17010: done transferring module to remote 12755 1727204148.17161: _low_level_execute_command(): starting 12755 1727204148.17227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/ /root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/AnsiballZ_command.py && sleep 0' 12755 1727204148.18598: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204148.18657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204148.18673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204148.18693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204148.18864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.18985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.19066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.21236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204148.21240: stdout chunk (state=3): >>><<< 12755 1727204148.21249: stderr chunk (state=3): >>><<< 12755 1727204148.21350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204148.21358: _low_level_execute_command(): starting 12755 1727204148.21361: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/AnsiballZ_command.py && sleep 0' 12755 1727204148.22786: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204148.22854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204148.22877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204148.22903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.23018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.23206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.23226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.23388: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.44431: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:48.412695", "end": "2024-09-24 14:55:48.443152", "delta": "0:00:00.030457", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204148.46342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204148.46347: stdout chunk (state=3): >>><<< 12755 1727204148.46351: stderr chunk (state=3): >>><<< 12755 1727204148.46426: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:55:48.412695", "end": "2024-09-24 14:55:48.443152", "delta": "0:00:00.030457", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204148.46459: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': 
None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204148.46467: _low_level_execute_command(): starting 12755 1727204148.46474: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204148.0562792-17183-275167357892054/ > /dev/null 2>&1 && sleep 0' 12755 1727204148.46973: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204148.46977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 12755 1727204148.46980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204148.46982: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 12755 1727204148.46984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.47045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204148.47048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.47054: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12755 1727204148.47103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.49295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204148.49299: stdout chunk (state=3): >>><<< 12755 1727204148.49302: stderr chunk (state=3): >>><<< 12755 1727204148.49305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204148.49307: handler run complete 12755 1727204148.49309: Evaluated conditional (False): False 12755 1727204148.49319: attempt loop complete, returning result 12755 1727204148.49322: _execute() done 12755 1727204148.49329: dumping result to json 12755 1727204148.49337: done dumping result, returning 12755 1727204148.49348: done running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd 
services [12b410aa-8751-72e9-1a19-0000000001b6] 12755 1727204148.49354: sending task result for task 12b410aa-8751-72e9-1a19-0000000001b6 12755 1727204148.49503: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001b6 12755 1727204148.49508: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.030457", "end": "2024-09-24 14:55:48.443152", "rc": 0, "start": "2024-09-24 14:55:48.412695" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 12755 1727204148.49639: no more pending results, returning what we have 12755 1727204148.49643: results queue empty 12755 1727204148.49644: checking for any_errors_fatal 12755 1727204148.49651: done checking for any_errors_fatal 12755 1727204148.49652: checking for max_fail_percentage 12755 1727204148.49654: done checking for max_fail_percentage 12755 1727204148.49655: checking to see if all hosts have failed and the running result is not ok 12755 1727204148.49656: done checking to see if all hosts have failed 12755 1727204148.49657: getting the remaining hosts for this loop 12755 1727204148.49658: done getting the remaining hosts for this loop 12755 1727204148.49663: getting the next task for host managed-node1 12755 1727204148.49674: done getting next task for host managed-node1 12755 1727204148.49676: ^ task is: TASK: Restore the /etc/resolv.conf for 
initscript 12755 1727204148.49679: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12755 1727204148.49684: getting variables 12755 1727204148.49686: in VariableManager get_vars() 12755 1727204148.49774: Calling all_inventory to load vars for managed-node1 12755 1727204148.49777: Calling groups_inventory to load vars for managed-node1 12755 1727204148.49780: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204148.49796: Calling all_plugins_play to load vars for managed-node1 12755 1727204148.49799: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204148.49803: Calling groups_plugins_play to load vars for managed-node1 12755 1727204148.51091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204148.54236: done with get_vars() 12755 1727204148.54291: done getting variables 12755 1727204148.54371: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 Tuesday 24 September 2024 14:55:48 
-0400 (0:00:00.584) 0:01:13.779 ***** 12755 1727204148.54414: entering _queue_task() for managed-node1/command 12755 1727204148.54731: worker is 1 (out of 1 available) 12755 1727204148.54750: exiting _queue_task() for managed-node1/command 12755 1727204148.54766: done queuing things up, now waiting for results queue to drain 12755 1727204148.54767: waiting for pending results... 12755 1727204148.55075: running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript 12755 1727204148.55182: in run() - task 12b410aa-8751-72e9-1a19-0000000001b7 12755 1727204148.55196: variable 'ansible_search_path' from source: unknown 12755 1727204148.55233: calling self._execute() 12755 1727204148.55326: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204148.55334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204148.55353: variable 'omit' from source: magic vars 12755 1727204148.55784: variable 'ansible_distribution_major_version' from source: facts 12755 1727204148.55789: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204148.55964: variable 'network_provider' from source: set_fact 12755 1727204148.55968: Evaluated conditional (network_provider == "initscripts"): False 12755 1727204148.55971: when evaluation is False, skipping this task 12755 1727204148.55974: _execute() done 12755 1727204148.55977: dumping result to json 12755 1727204148.55980: done dumping result, returning 12755 1727204148.55983: done running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript [12b410aa-8751-72e9-1a19-0000000001b7] 12755 1727204148.55985: sending task result for task 12b410aa-8751-72e9-1a19-0000000001b7 12755 1727204148.56118: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001b7 12755 1727204148.56122: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 12755 1727204148.56192: no more pending results, returning what we have 12755 1727204148.56196: results queue empty 12755 1727204148.56197: checking for any_errors_fatal 12755 1727204148.56206: done checking for any_errors_fatal 12755 1727204148.56207: checking for max_fail_percentage 12755 1727204148.56211: done checking for max_fail_percentage 12755 1727204148.56212: checking to see if all hosts have failed and the running result is not ok 12755 1727204148.56213: done checking to see if all hosts have failed 12755 1727204148.56214: getting the remaining hosts for this loop 12755 1727204148.56216: done getting the remaining hosts for this loop 12755 1727204148.56220: getting the next task for host managed-node1 12755 1727204148.56227: done getting next task for host managed-node1 12755 1727204148.56230: ^ task is: TASK: Verify network state restored to default 12755 1727204148.56234: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204148.56238: getting variables 12755 1727204148.56239: in VariableManager get_vars() 12755 1727204148.56293: Calling all_inventory to load vars for managed-node1 12755 1727204148.56297: Calling groups_inventory to load vars for managed-node1 12755 1727204148.56299: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204148.56312: Calling all_plugins_play to load vars for managed-node1 12755 1727204148.56316: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204148.56320: Calling groups_plugins_play to load vars for managed-node1 12755 1727204148.57952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204148.59563: done with get_vars() 12755 1727204148.59586: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Tuesday 24 September 2024 14:55:48 -0400 (0:00:00.052) 0:01:13.832 ***** 12755 1727204148.59671: entering _queue_task() for managed-node1/include_tasks 12755 1727204148.59926: worker is 1 (out of 1 available) 12755 1727204148.59944: exiting _queue_task() for managed-node1/include_tasks 12755 1727204148.59958: done queuing things up, now waiting for results queue to drain 12755 1727204148.59960: waiting for pending results... 
12755 1727204148.60227: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 12755 1727204148.60337: in run() - task 12b410aa-8751-72e9-1a19-0000000001b8 12755 1727204148.60351: variable 'ansible_search_path' from source: unknown 12755 1727204148.60452: calling self._execute() 12755 1727204148.60525: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204148.60533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204148.60549: variable 'omit' from source: magic vars 12755 1727204148.60960: variable 'ansible_distribution_major_version' from source: facts 12755 1727204148.60972: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204148.60979: _execute() done 12755 1727204148.60984: dumping result to json 12755 1727204148.60987: done dumping result, returning 12755 1727204148.60998: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [12b410aa-8751-72e9-1a19-0000000001b8] 12755 1727204148.61006: sending task result for task 12b410aa-8751-72e9-1a19-0000000001b8 12755 1727204148.61129: done sending task result for task 12b410aa-8751-72e9-1a19-0000000001b8 12755 1727204148.61132: WORKER PROCESS EXITING 12755 1727204148.61198: no more pending results, returning what we have 12755 1727204148.61204: in VariableManager get_vars() 12755 1727204148.61264: Calling all_inventory to load vars for managed-node1 12755 1727204148.61268: Calling groups_inventory to load vars for managed-node1 12755 1727204148.61270: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204148.61281: Calling all_plugins_play to load vars for managed-node1 12755 1727204148.61284: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204148.61287: Calling groups_plugins_play to load vars for managed-node1 12755 1727204148.62560: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204148.64336: done with get_vars() 12755 1727204148.64357: variable 'ansible_search_path' from source: unknown 12755 1727204148.64369: we have included files to process 12755 1727204148.64370: generating all_blocks data 12755 1727204148.64372: done generating all_blocks data 12755 1727204148.64377: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12755 1727204148.64378: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12755 1727204148.64380: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12755 1727204148.64825: done processing included file 12755 1727204148.64826: iterating over new_blocks loaded from include file 12755 1727204148.64827: in VariableManager get_vars() 12755 1727204148.64849: done with get_vars() 12755 1727204148.64851: filtering new block on tags 12755 1727204148.64884: done filtering new block on tags 12755 1727204148.64887: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 12755 1727204148.64892: extending task lists for all hosts with included blocks 12755 1727204148.65980: done extending task lists 12755 1727204148.65982: done processing included files 12755 1727204148.65982: results queue empty 12755 1727204148.65983: checking for any_errors_fatal 12755 1727204148.65985: done checking for any_errors_fatal 12755 1727204148.65986: checking for max_fail_percentage 12755 1727204148.65987: done checking for max_fail_percentage 12755 1727204148.65987: checking to see if all hosts have failed and the running 
result is not ok 12755 1727204148.65988: done checking to see if all hosts have failed 12755 1727204148.65990: getting the remaining hosts for this loop 12755 1727204148.65991: done getting the remaining hosts for this loop 12755 1727204148.65993: getting the next task for host managed-node1 12755 1727204148.65997: done getting next task for host managed-node1 12755 1727204148.65999: ^ task is: TASK: Check routes and DNS 12755 1727204148.66001: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204148.66003: getting variables 12755 1727204148.66004: in VariableManager get_vars() 12755 1727204148.66021: Calling all_inventory to load vars for managed-node1 12755 1727204148.66023: Calling groups_inventory to load vars for managed-node1 12755 1727204148.66025: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204148.66030: Calling all_plugins_play to load vars for managed-node1 12755 1727204148.66032: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204148.66034: Calling groups_plugins_play to load vars for managed-node1 12755 1727204148.67499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204148.70550: done with get_vars() 12755 1727204148.70597: done getting variables 12755 1727204148.70659: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:55:48 -0400 (0:00:00.111) 0:01:13.944 ***** 12755 1727204148.70877: entering _queue_task() for managed-node1/shell 12755 1727204148.71282: worker is 1 (out of 1 available) 12755 1727204148.71501: exiting _queue_task() for managed-node1/shell 12755 1727204148.71513: done queuing things up, now waiting for results queue to drain 12755 1727204148.71515: waiting for pending results... 
12755 1727204148.72051: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 12755 1727204148.72084: in run() - task 12b410aa-8751-72e9-1a19-0000000009f0 12755 1727204148.72108: variable 'ansible_search_path' from source: unknown 12755 1727204148.72298: variable 'ansible_search_path' from source: unknown 12755 1727204148.72344: calling self._execute() 12755 1727204148.72349: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204148.72352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204148.72355: variable 'omit' from source: magic vars 12755 1727204148.72592: variable 'ansible_distribution_major_version' from source: facts 12755 1727204148.72597: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204148.72600: variable 'omit' from source: magic vars 12755 1727204148.72814: variable 'omit' from source: magic vars 12755 1727204148.72869: variable 'omit' from source: magic vars 12755 1727204148.72919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12755 1727204148.73035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12755 1727204148.73039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12755 1727204148.73042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204148.73045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12755 1727204148.73165: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12755 1727204148.73170: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204148.73173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204148.73354: 
Set connection var ansible_connection to ssh 12755 1727204148.73357: Set connection var ansible_module_compression to ZIP_DEFLATED 12755 1727204148.73453: Set connection var ansible_shell_type to sh 12755 1727204148.73461: Set connection var ansible_timeout to 10 12755 1727204148.73513: Set connection var ansible_shell_executable to /bin/sh 12755 1727204148.73520: Set connection var ansible_pipelining to False 12755 1727204148.73523: variable 'ansible_shell_executable' from source: unknown 12755 1727204148.73525: variable 'ansible_connection' from source: unknown 12755 1727204148.73532: variable 'ansible_module_compression' from source: unknown 12755 1727204148.73535: variable 'ansible_shell_type' from source: unknown 12755 1727204148.73538: variable 'ansible_shell_executable' from source: unknown 12755 1727204148.73540: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204148.73543: variable 'ansible_pipelining' from source: unknown 12755 1727204148.73545: variable 'ansible_timeout' from source: unknown 12755 1727204148.73648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204148.73723: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204148.73730: variable 'omit' from source: magic vars 12755 1727204148.73737: starting attempt loop 12755 1727204148.73741: running the handler 12755 1727204148.73744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12755 1727204148.73748: 
_low_level_execute_command(): starting 12755 1727204148.73755: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12755 1727204148.74640: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204148.74643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204148.74660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204148.74672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204148.74676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.74727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204148.74731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.74764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.74919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.76802: stdout chunk (state=3): >>>/root <<< 12755 1727204148.76913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204148.76955: stderr chunk (state=3): >>><<< 12755 1727204148.76964: stdout chunk (state=3): >>><<< 12755 
1727204148.76987: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204148.77002: _low_level_execute_command(): starting 12755 1727204148.77008: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336 `" && echo ansible-tmp-1727204148.769862-17214-151510679714336="` echo /root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336 `" ) && sleep 0' 12755 1727204148.77456: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204148.77466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.77469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204148.77472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.77533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.77536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.77627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.80018: stdout chunk (state=3): >>>ansible-tmp-1727204148.769862-17214-151510679714336=/root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336 <<< 12755 1727204148.80022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204148.80028: stdout chunk (state=3): >>><<< 12755 1727204148.80039: stderr chunk (state=3): >>><<< 12755 1727204148.80054: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204148.769862-17214-151510679714336=/root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204148.80258: variable 'ansible_module_compression' from source: unknown 12755 1727204148.80262: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12755n096dwm_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12755 1727204148.80305: variable 'ansible_facts' from source: unknown 12755 1727204148.80406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/AnsiballZ_command.py 12755 1727204148.80619: Sending initial data 12755 1727204148.80623: Sent initial data (155 bytes) 12755 1727204148.81212: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204148.81223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204148.81237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204148.81307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.81344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204148.81358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.81367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.81456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.83355: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12755 1727204148.83361: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12755 1727204148.83456: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12755 1727204148.83544: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12755n096dwm_/tmpyrij0jmp /root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/AnsiballZ_command.py <<< 12755 1727204148.83569: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/AnsiballZ_command.py" <<< 12755 1727204148.83573: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-12755n096dwm_/tmpyrij0jmp" to remote "/root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/AnsiballZ_command.py" <<< 12755 1727204148.84744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204148.84831: stderr chunk (state=3): >>><<< 12755 1727204148.84984: stdout chunk (state=3): >>><<< 12755 1727204148.84988: done transferring module to remote 12755 1727204148.84995: _low_level_execute_command(): starting 12755 1727204148.84998: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/ /root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/AnsiballZ_command.py && sleep 0' 12755 1727204148.85682: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204148.85686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204148.85744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204148.85769: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.85837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12755 1727204148.85883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.85922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.85988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204148.88097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204148.88101: stdout chunk (state=3): >>><<< 12755 1727204148.88103: stderr chunk (state=3): >>><<< 12755 1727204148.88106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204148.88108: _low_level_execute_command(): starting 12755 1727204148.88114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/AnsiballZ_command.py && sleep 0' 12755 1727204148.88788: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204148.88795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204148.88798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204148.88801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204148.88803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 12755 1727204148.88995: stderr chunk (state=3): >>>debug2: match not found <<< 12755 1727204148.88998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.89001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12755 1727204148.89004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 12755 1727204148.89006: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12755 1727204148.89008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12755 1727204148.89015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12755 1727204148.89018: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12755 1727204148.89020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12755 1727204148.89022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204148.89024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204148.89075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204149.08069: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:d4:45:6e:f8:dd brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.210/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2979sec preferred_lft 2979sec\n inet6 fe80::d080:f60d:659:9515/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.210 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.210 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. 
If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:55:49.070007", "end": "2024-09-24 14:55:49.079438", "delta": "0:00:00.009431", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12755 1727204149.09923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 12755 1727204149.09956: stderr chunk (state=3): >>><<< 12755 1727204149.09968: stdout chunk (state=3): >>><<< 12755 1727204149.10003: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:d4:45:6e:f8:dd brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.210/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2979sec preferred_lft 2979sec\n inet6 fe80::d080:f60d:659:9515/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.210 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.210 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:55:49.070007", "end": "2024-09-24 14:55:49.079438", "delta": "0:00:00.009431", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 12755 1727204149.10094: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12755 1727204149.10112: _low_level_execute_command(): starting 12755 1727204149.10195: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204148.769862-17214-151510679714336/ > /dev/null 2>&1 && sleep 0' 12755 1727204149.10848: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 12755 1727204149.10973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12755 1727204149.11023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12755 1727204149.11068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12755 1727204149.13237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12755 1727204149.13252: stderr chunk (state=3): >>><<< 12755 1727204149.13255: stdout chunk (state=3): >>><<< 12755 1727204149.13271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12755 1727204149.13279: handler run complete 12755 1727204149.13317: Evaluated conditional (False): False 12755 1727204149.13330: attempt loop complete, returning result 12755 1727204149.13333: _execute() done 12755 1727204149.13337: dumping result to json 12755 1727204149.13344: done dumping result, returning 12755 1727204149.13353: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [12b410aa-8751-72e9-1a19-0000000009f0] 12755 1727204149.13357: sending task result for task 12b410aa-8751-72e9-1a19-0000000009f0 12755 1727204149.13482: done sending task result for task 12b410aa-8751-72e9-1a19-0000000009f0 12755 1727204149.13485: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009431", "end": "2024-09-24 14:55:49.079438", "rc": 0, "start": "2024-09-24 14:55:49.070007" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:d4:45:6e:f8:dd brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.210/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2979sec preferred_lft 2979sec 
inet6 fe80::d080:f60d:659:9515/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.210 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.210 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 12755 1727204149.13606: no more pending results, returning what we have 12755 1727204149.13610: results queue empty 12755 1727204149.13612: checking for any_errors_fatal 12755 1727204149.13614: done checking for any_errors_fatal 12755 1727204149.13615: checking for max_fail_percentage 12755 1727204149.13617: done checking for max_fail_percentage 12755 1727204149.13618: checking to see if all hosts have failed and the running result is not ok 12755 1727204149.13619: done checking to see if all hosts have failed 12755 1727204149.13620: getting the remaining hosts for this loop 12755 1727204149.13626: done getting the remaining hosts for this loop 12755 1727204149.13632: getting the next task for host managed-node1 12755 1727204149.13640: done getting next task for host managed-node1 12755 1727204149.13643: ^ task is: TASK: Verify DNS and network connectivity 12755 1727204149.13647: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12755 1727204149.13656: getting variables 12755 1727204149.13658: in VariableManager get_vars() 12755 1727204149.13721: Calling all_inventory to load vars for managed-node1 12755 1727204149.13725: Calling groups_inventory to load vars for managed-node1 12755 1727204149.13728: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204149.13740: Calling all_plugins_play to load vars for managed-node1 12755 1727204149.13744: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204149.13748: Calling groups_plugins_play to load vars for managed-node1 12755 1727204149.19768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204149.21385: done with get_vars() 12755 1727204149.21425: done getting variables 12755 1727204149.21485: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:55:49 -0400 (0:00:00.506) 0:01:14.450 ***** 12755 1727204149.21521: entering _queue_task() for managed-node1/shell 12755 1727204149.21903: worker is 1 (out of 1 available) 12755 1727204149.21917: exiting _queue_task() for managed-node1/shell 12755 1727204149.21931: done queuing things up, now waiting for results queue to drain 12755 1727204149.21934: waiting for pending results... 
12755 1727204149.22416: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 12755 1727204149.22431: in run() - task 12b410aa-8751-72e9-1a19-0000000009f1 12755 1727204149.22465: variable 'ansible_search_path' from source: unknown 12755 1727204149.22469: variable 'ansible_search_path' from source: unknown 12755 1727204149.22517: calling self._execute() 12755 1727204149.22608: variable 'ansible_host' from source: host vars for 'managed-node1' 12755 1727204149.22618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12755 1727204149.22630: variable 'omit' from source: magic vars 12755 1727204149.22984: variable 'ansible_distribution_major_version' from source: facts 12755 1727204149.22997: Evaluated conditional (ansible_distribution_major_version != '6'): True 12755 1727204149.23120: variable 'ansible_facts' from source: unknown 12755 1727204149.23825: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 12755 1727204149.23829: when evaluation is False, skipping this task 12755 1727204149.23833: _execute() done 12755 1727204149.23836: dumping result to json 12755 1727204149.23839: done dumping result, returning 12755 1727204149.23850: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [12b410aa-8751-72e9-1a19-0000000009f1] 12755 1727204149.23853: sending task result for task 12b410aa-8751-72e9-1a19-0000000009f1 12755 1727204149.23953: done sending task result for task 12b410aa-8751-72e9-1a19-0000000009f1 12755 1727204149.23955: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 12755 1727204149.24008: no more pending results, returning what we have 12755 1727204149.24014: results queue empty 12755 1727204149.24015: checking for any_errors_fatal 12755 1727204149.24033: done checking for any_errors_fatal 12755 
1727204149.24033: checking for max_fail_percentage 12755 1727204149.24035: done checking for max_fail_percentage 12755 1727204149.24036: checking to see if all hosts have failed and the running result is not ok 12755 1727204149.24038: done checking to see if all hosts have failed 12755 1727204149.24038: getting the remaining hosts for this loop 12755 1727204149.24040: done getting the remaining hosts for this loop 12755 1727204149.24045: getting the next task for host managed-node1 12755 1727204149.24057: done getting next task for host managed-node1 12755 1727204149.24059: ^ task is: TASK: meta (flush_handlers) 12755 1727204149.24061: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204149.24068: getting variables 12755 1727204149.24069: in VariableManager get_vars() 12755 1727204149.24129: Calling all_inventory to load vars for managed-node1 12755 1727204149.24132: Calling groups_inventory to load vars for managed-node1 12755 1727204149.24135: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204149.24149: Calling all_plugins_play to load vars for managed-node1 12755 1727204149.24152: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204149.24156: Calling groups_plugins_play to load vars for managed-node1 12755 1727204149.25388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204149.27499: done with get_vars() 12755 1727204149.27526: done getting variables 12755 1727204149.27601: in VariableManager get_vars() 12755 1727204149.27629: Calling all_inventory to load vars for managed-node1 12755 1727204149.27634: Calling groups_inventory to load vars for managed-node1 12755 1727204149.27637: Calling 
all_plugins_inventory to load vars for managed-node1 12755 1727204149.27647: Calling all_plugins_play to load vars for managed-node1 12755 1727204149.27650: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204149.27654: Calling groups_plugins_play to load vars for managed-node1 12755 1727204149.29064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204149.31266: done with get_vars() 12755 1727204149.31301: done queuing things up, now waiting for results queue to drain 12755 1727204149.31303: results queue empty 12755 1727204149.31304: checking for any_errors_fatal 12755 1727204149.31308: done checking for any_errors_fatal 12755 1727204149.31309: checking for max_fail_percentage 12755 1727204149.31310: done checking for max_fail_percentage 12755 1727204149.31311: checking to see if all hosts have failed and the running result is not ok 12755 1727204149.31312: done checking to see if all hosts have failed 12755 1727204149.31313: getting the remaining hosts for this loop 12755 1727204149.31314: done getting the remaining hosts for this loop 12755 1727204149.31317: getting the next task for host managed-node1 12755 1727204149.31322: done getting next task for host managed-node1 12755 1727204149.31324: ^ task is: TASK: meta (flush_handlers) 12755 1727204149.31326: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12755 1727204149.31329: getting variables 12755 1727204149.31330: in VariableManager get_vars() 12755 1727204149.31349: Calling all_inventory to load vars for managed-node1 12755 1727204149.31351: Calling groups_inventory to load vars for managed-node1 12755 1727204149.31353: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204149.31357: Calling all_plugins_play to load vars for managed-node1 12755 1727204149.31359: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204149.31361: Calling groups_plugins_play to load vars for managed-node1 12755 1727204149.32829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204149.34701: done with get_vars() 12755 1727204149.34723: done getting variables 12755 1727204149.34769: in VariableManager get_vars() 12755 1727204149.34788: Calling all_inventory to load vars for managed-node1 12755 1727204149.34792: Calling groups_inventory to load vars for managed-node1 12755 1727204149.34794: Calling all_plugins_inventory to load vars for managed-node1 12755 1727204149.34798: Calling all_plugins_play to load vars for managed-node1 12755 1727204149.34800: Calling groups_plugins_inventory to load vars for managed-node1 12755 1727204149.34802: Calling groups_plugins_play to load vars for managed-node1 12755 1727204149.35887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12755 1727204149.37611: done with get_vars() 12755 1727204149.37636: done queuing things up, now waiting for results queue to drain 12755 1727204149.37638: results queue empty 12755 1727204149.37638: checking for any_errors_fatal 12755 1727204149.37639: done checking for any_errors_fatal 12755 1727204149.37640: checking for max_fail_percentage 12755 1727204149.37641: done checking for max_fail_percentage 12755 1727204149.37641: checking to see if all hosts have failed and the running result is not 
ok 12755 1727204149.37642: done checking to see if all hosts have failed 12755 1727204149.37643: getting the remaining hosts for this loop 12755 1727204149.37643: done getting the remaining hosts for this loop 12755 1727204149.37652: getting the next task for host managed-node1 12755 1727204149.37655: done getting next task for host managed-node1 12755 1727204149.37655: ^ task is: None 12755 1727204149.37657: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12755 1727204149.37657: done queuing things up, now waiting for results queue to drain 12755 1727204149.37658: results queue empty 12755 1727204149.37659: checking for any_errors_fatal 12755 1727204149.37659: done checking for any_errors_fatal 12755 1727204149.37660: checking for max_fail_percentage 12755 1727204149.37660: done checking for max_fail_percentage 12755 1727204149.37661: checking to see if all hosts have failed and the running result is not ok 12755 1727204149.37661: done checking to see if all hosts have failed 12755 1727204149.37664: getting the next task for host managed-node1 12755 1727204149.37666: done getting next task for host managed-node1 12755 1727204149.37667: ^ task is: None 12755 1727204149.37667: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed-node1 : ok=108 changed=5 unreachable=0 failed=0 skipped=121 rescued=0 ignored=0 Tuesday 24 September 2024 14:55:49 -0400 (0:00:00.162) 0:01:14.613 ***** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.94s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.88s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.60s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.57s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.56s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Install dnsmasq --------------------------------------------------------- 2.08s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Create test interfaces -------------------------------------------------- 2.06s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Install pgrep, sysctl --------------------------------------------------- 1.83s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Gathering Facts --------------------------------------------------------- 1.61s 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.51s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.43s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.43s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Check which packages are installed --- 1.42s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.31s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.30s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.25s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Check which packages are installed --- 1.21s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.14s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.11s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Gathering Facts 
--------------------------------------------------------- 1.06s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 12755 1727204149.37799: RUNNING CLEANUP